Wednesday, February 2, 2011

Linkscape is faster, link analysis is improved and other goodies


Over the past few months our dev and product teams have been busy little bees working on some pretty exciting new enhancements and upgrades to the web app. Since it's the holiday season... we might even liken them to Santa's little elves building toys for all the good little SEOs around the world.


We've been rolling out quite a few updates each week. Some you've probably noticed, like the Linkscape index being updated faster and faster (Three cheers to Chas and Bryce for that!), while others may have gone unnoticed like fixing some pagination issues. I wanted to take this time to talk about a few of the bigger items we've released recently (even today!) and a few features you should watch for in the future.


As I mentioned above we're all in a tizzy over here about getting the Linkscape index updated quicker. Bryce and Chas on the dev team have made some major improvements to speed, so be sure to thank them for your fresh new data. :)


We just had the latest Linkscape update yesterday. Check out the stats for index 35:

38,807,464,322 (38.8 billion) pages
360,354,116 (360 million) subdomains
107,159,213 (107 million) root domains
393,701,245,290 (393 billion) links
2.11% of all links are nofollowed (up 1.4% since early December)
57.01% are internal (down from 57.66% in early December)
42.99% are external (down from 43.34% in early December)
6.51% of pages have rel=canonical
62.02 links/page on average

Remember when you used to click on the Link Analysis tab and it would look like this:


Well today we're launching some amazing updates that will help you see in an instant how you're doing against your competitors. As an example, I have a campaign set up for my husband's photography site. Here's what the new Link Analysis tab looks like:


You can see at a glance that the Photography for Real Estate site is obviously the largest domain of the four and clearly has more Domain Authority, mozTrust, etc. The interesting thing to note is that in local markets it doesn't always rank first. :) So, beyond this quick information about your site and each of the competitors, you can also do deeper analysis into the links:


 


In this view I'm looking at the Linking Root Domains, similar to what you find in Open Site Explorer. But you can also see followed backlinks, top pages and anchor text information as well. I've only shown my site and one competitor in the screenshot, but on the page you'll see all the competitors. You can quickly take a peek into all your competitors' backlinks on this one page!


Plus, you can easily download a PDF report and/or export up to 10,000 links (or top pages, linking domains, anchor text) into a CSV file, all with the click of a button. Whee... Data FTW!


Note: This is being launched today... so if you don't see it quite yet, don't get discouraged! Take a break, then come back and try again. :D


Another Note: The top linking root domain to my husband's site is from seomoz.org because I linked to it from my profile. :)


If you're using the web app, you've probably already received one of these sexy new Keyword Ranking Report emails. Now, you can quickly assess how well your keywords have been doing over the past week. Check out a sample email below:


Now if that email doesn't send you into automatic geek email heaven... then heck I don't know what would. :D


You may have noticed... or have even been alarmed to see that your mozPoints are no longer showing up in the right nav of the blog. Plus the Top User page has been a bit, shall we say, vacant lately. Well please don't worry! Your mozPoints have not gone away, and they are still showing up properly on your profile page. We are making some enhancements to the mozPoint system which will make more sense when we release our new profiles early next year. We apologize that we didn't let everyone know ahead of time before we made the changes! We received quite a few tweets, DMs, PMs, emails and support tickets asking where their mozPoints, rankings and Top User information went.


The bad news, you'll have to wait a couple weeks to see that information again on the right nav, but the good news is that there are exciting improvements coming soon! I know, I know, you all are going to ask to get more information about these improvements... I promise, all in good time. :) We have to keep some surprises around here!


This is a super fun one. We've recently created a whole new About Us section that includes more information on how to contact us, more mozzers and our TAGFEE tenets, SEOmoz job openings, cool press and awards we've received, and upcoming SEOmoz events (where we're speaking + meetups and such).


Firefox and Chrome toolbar update, with some fixes and position numbers added to the SERP overlay
Improvement to the rankings overview/history to display previous ranking data during retrieval of new rankings
New holiday Roger on the home page (can I get a w00t w00t!)
Manual selection of URLs for On-Page reports
Better canonicalization check at setup – will check for redirects as well as missing redirects
Repaired the ailing Juicy Link Finder


Plus, we've recently added the PRO Feature Change Log, which you can access at any time to stay up-to-date on what's coming up next or what changes have recently been implemented. We've also added a "request a feature" button to this page and hope that you'll use it! This is the best way to get that feature you've been dying to see in the web app.


Starting in January we have so many amazing new developments we're bursting at the seams over here! We hope you all have a very happy holiday season and can't wait to get the ball rolling in 2011. Be there or be square. 

About jennita — Jen Lopez is the Community Manager at SEOmoz and a devotee of the fine arts of Twitter, Facebook and all things social media. She has a background in web development and will always be an SEO at heart.

View the original article here

Coming Soon(ish) upgrades :)

I have already mentioned this to our subscribers and affiliates, but we will soon be closing off the ability to subscribe to our membership site so we can perform upgrades.

We want to update Drupal, start the new site design, and switch the membership management software from what has become a prepackaged, sorta big and hairy platform to something more manageable. Doing this will allow us to accept payments on site, provide a few different tiers of access, and let me delegate some aspects of customer support and account management so it is easier for some of our employees to handle things.

The ultimate goal of this upgrade is to make the site look more modern and cohesive, and to let us spend more of our resources on creating new content and tools and less on managing the underlying software and such.

When we moved to a membership site a few years ago, I didn't appreciate the level of success it would reach, and I didn't realize how some smaller bugs would grow into bigger problems as our site grew. Most of these bugs have been fixed, but a few gremlins remain that we created, sorta by reinventing the wheel instead of buying and then layering more value on top. :)

I want to be involved in the site daily, but it really makes sense for my health (and mind) to divide the labour on some of the administrative stuff instead of me trying to manage every aspect myself and burning out. Our staff are great, and now we just need to implement systems that help them be great(er). :D

We aim to be back up and open sometime in mid-January. The blog and website will stay up, but given the number of databases the website currently keeps synchronized (along with PayPal), it is probably best to close off new subscriptions while we change things around.

We could try to make the changes while keeping everything active, but if any strange anomalies cropped up, that would probably be more stress than I would care to cope with. I love the site and want to keep it that way (vs. pulling my hair out from putting too much stress on myself). :D

If you want to be notified when we reopen, please comment on this post, and once everything is relaunched under the new system and tested we will e-mail everyone who commented. :)

Hope the holidays are going well for you. More to come as we make some significant progress with these changes.



Tuesday, February 1, 2011

A few brave SEOs conquered 'Movember'

Seven of SEO.com's best talent kept their upper lips safe from the bite of a razor blade for Movember this November; unfortunately, the rest of the office (and their respective wives) sat it out. The exception among the seven is Christian, the man on the right with the weak-sauce 'stache: he had to shave last week for some family photos. Seriously, priorities...

Kudos as well to the bold and brave souls who committed to their mustaches for four full weeks, making it even through Thanksgiving without losing a turkey leg in their grisly, nasty facial hair.


Nathan Blair (second photo below) won the Movember competition with his oily black handlebars. Cheers to you, Nathan Blair; mustaches all over the world are proud.



Oh, the opportunity

I do keyword research.


The opportunities I see before me still surprise me.


Keyword lists, showing the frequency of searches: market research nirvana. It is like being a god, dipping into the minds of mortals.


And most people outside SEO. Still. Don't. Get. It.


Have you ever tried explaining keywords to people, and received blank looks in return?


We can trawl through a keyword research tool and list thousands of niche opportunities. Demand is on display. It is broadcast to us.


Once we discover demand, we measure the competition, quantify the opportunity, build a site, and dive into demand streams that existed long before the internet was invented.


Demand, meet supply.


Look at all this demand:

"Japanese Japanese translation" - 450,000 monthly searches
"Hospital of jobs" - 823,000 monthly searches
"forklift certification" - 27,100 monthly searches
"Address labels" - 301,000 monthly searches
"digital signage" - 201,000 monthly searches
"student credit cards" - 135,000 monthly searches
"coin collecting" - 60,500 monthly searches

And as we know, this is only one keyword per niche. The real gems can be found deep in the long tail of permutations and variations.
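Quantifying demand like this lends itself to a quick script. Below is a hedged sketch, not from the article: the monthly search volumes come from the niche list above (labels tidied), while the competing-page counts are invented placeholders, since the post publishes no competition data. The "opportunity" ratio is just one crude way to quantify the possibility.

```python
# Rank niches by demand relative to competition.
# Volumes are from the list above; competing-page counts are made up.
monthly_searches = {
    "hospital jobs": 823_000,
    "japanese translation": 450_000,
    "address labels": 301_000,
    "digital signage": 201_000,
    "student credit cards": 135_000,
    "coin collecting": 60_500,
    "forklift certification": 27_100,
}
competing_pages = {  # placeholder values for illustration only
    "hospital jobs": 9_000_000,
    "japanese translation": 4_500_000,
    "address labels": 1_200_000,
    "digital signage": 800_000,
    "student credit cards": 2_000_000,
    "coin collecting": 600_000,
    "forklift certification": 90_000,
}

def opportunity(niche):
    # Crude score: searches per competing page. Higher = less contested demand.
    return monthly_searches[niche] / competing_pages[niche]

ranked = sorted(monthly_searches, key=opportunity, reverse=True)
for niche in ranked:
    print(f"{niche}: {opportunity(niche):.3f}")
```

With these placeholder figures, the smallest niche by raw volume (forklift certification) comes out on top, which is the point: raw search volume alone doesn't tell you where the opportunity is.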


The search channel still amazes me.


It is so powerful and so under valued.


Have you seen Mad Men?


If you haven't, it's a great show set in an advertising agency in the 1960s. The ad executives were the rock stars of their time, paid good money to know what was on the minds of consumers.


What would they have made of a keyword research tool, I ask myself?


Or of our digital zeitgeist?


And in contrast to fifty years ago, there are fewer barriers to entry in many traditional markets. In the past, competing nationwide or internationally required a large multinational machine, people and capital. Now, with a credit card, we can touch a vast network in an instant.


Fifty years ago, publishing a book was difficult and expensive. A major publisher could get large retail shelf space, but you could not. You probably still can't. You would need to print many copies, a risk and expense out of reach of most people. A publisher could get reviews and put the publicity machine to work.


Now we can compete.


We can get far more coverage in much less time, for a fraction of the cost.


So many niches, so little time.


So, what do you do today?




Highlights of Pubcon 2010 social media Keynote Panel

Day 2 of PubCon 2010 began with a knockout keynote panel featuring Sarah Evans, Chris Brogan, Brian Clark and Scott Stratten. With back and forth from all the participants, I'll summarize the general topics of discussion. Here are the highlights:

Is social media a fad? (Gotta start somewhere)


No.


Get internal buy-in on social media


You can use the question "What kind of metrics can we move?" instead of just saying "We need to make videos."


You can point to the fact that the conversations happen regardless of whether you join in; participating is a choice.


Using social media the right way is very effective because it is driven by word of mouth and personal conviction. It is one of the most effective channels.


Social media planning


The social media plan should be created in the context of the entire marketing plan, with a portion of the budget going to social media and the same cohesion and accountability as any other marketing channel.


If you hate people, you should not be doing social media. People have been making mistakes for a long time; social media just makes these mistakes more public. Make sure you have the right people behind the plan and its execution.


Social Media vs SEO


Social media and SEO are on a continuum. Some of the smartest people in social media are SEOs, because they know about the importance of content and distribution.


There is no competition between the channels. Whether it's social media, SEO, PPC, etc., the amount of time to spend on each depends on the effectiveness of the channels, and it is different for every company.


Social media error


Find a good mix of active and passive income. Many in social media fail because they've built their revenue foundation on a business model that requires them to be "on" for everything, and it is impossible for one person to keep up that way.




Monday, January 31, 2011

Search engine optimization: Know before you go

Many companies choose to jump into SEO plans on a whim, without really understanding what they're getting into. While SEO is becoming more and more critical to a successful business model, there are many things that must be considered before moving forward. Below are a few points to get the juices flowing.


Seems like a simple thing, but you need to know how SEO fits into your overall marketing objectives. What is its reach? What do you want it to achieve? Don't do SEO just because everyone else is doing it. Do it for a reason.



November Linkscape index update live (and new Linkscape WordPress plugin)

This evening our 33rd Linkscape update went live. You can see the new link data in Open Site Explorer, the MozBar, Linkscape Classic, the Link Intersect tool, and the SEOmoz API.


E.g., the link data in OSE for this post now runs through mid-October


This update provides data crawled from mid-October until the beginning of November. Unfortunately, we were delayed by processing capacity at Amazon (weirdly, EC2 machine availability has a "Christmas time rush"), and we will be taking measures to ensure that the December index update goes smoothly.


Statistics for this index:

40,605,301,071 (40 billion) pages
425,695,258 (425 million) subdomains
103,776,906 (103 million) root domains
395,851,127,399 (395 billion) links
2.10% of all links are nofollowed (up 0.06% from October)
56.99% are internal (down from 57.15% in October)
43.01% are external (up from 42.85% in October)
5.88% of pages have rel=canonical (up from 5.42% in October)
62.28 links/page on average (down from 62.35 in October)

I'm also excited to say that the beginnings of a WordPress plugin powered by Linkscape are available today. It has only a few features so far, but we want your help in telling us what would be valuable to have in a useful and interesting WordPress tool.


Still in alpha stage, but it shows links + top pages in the admin panel


The plugin was built using only the free SEOmoz API. This is a very early release and there may still be some bugs, but if you have feature ideas or suggestions, please leave them in the comments!
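The post shows no code, but the free SEOmoz API the plugin relies on used signed GET requests. Below is a minimal sketch assuming the authentication scheme as it was documented at the time (HMAC-SHA1 over "AccessID\nExpires", base64-encoded, passed as a Signature query parameter); the credentials and target URL here are placeholders, not real values.

```python
import base64
import hashlib
import hmac
import time
from urllib.parse import quote

def signed_metrics_url(target, access_id, secret_key, valid_for=300):
    """Build a signed url-metrics request URL for the Linkscape API."""
    expires = int(time.time()) + valid_for
    # Signature = base64(HMAC-SHA1(secret, "AccessID\nExpires")), URL-encoded.
    to_sign = f"{access_id}\n{expires}"
    digest = hmac.new(secret_key.encode(), to_sign.encode(), hashlib.sha1).digest()
    signature = quote(base64.b64encode(digest).decode(), safe="")
    return (
        "http://lsapi.seomoz.com/linkscape/url-metrics/"
        f"{quote(target, safe='')}"
        f"?AccessID={access_id}&Expires={expires}&Signature={signature}"
    )

# Placeholder credentials for illustration; real calls need a registered key.
print(signed_metrics_url("www.seomoz.org", "member-123", "demo-secret"))
```

The response was a JSON object of link metrics for the target URL, which a plugin like this could cache and render in the WordPress admin panel.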



Google keyword tools keep getting better :)

When Google debuted their new keyword tool, many advertisers complained that it was not very granular and leaned toward broad, presumptuous matches. Not letting you drill down into a specific area by default, and presuming you wanted a broader basket of keywords than you requested, especially does not make sense when you think about how Quality Score punishes irrelevant ads.


Based on user feedback / complaints, they updated the tool to offer 3 different filters: "more like this", include or exclude keywords, and an optional setting which makes the search stricter when turned on.


Given the keyword categorization, localization, trend data, match type options, these new filters, handy CSV export options and all the data it provides, it is shaping up to be quite a great tool with a variety of unique applications for market research. It is so efficient that it can do a lot of work in a few minutes, yet it is so addictive that you can spend hours playing with it. :)


Unfortunately for Google, they were recently one-upped on this front... by Google! ;)


Google recently announced a new keyword tool built to estimate market size in different global markets. The regular keyword tool can do this also, but the new tool lets you compare market sizes (by search demand) side by side at a glance, and lists relevant local keywords in another language that have roughly the same meaning. Awesome stuff, Google!



Beyond good and evil in SEO

I recently heard a story about a local SEO shop whose clients almost all lost their Google rankings overnight. Apparently the shop was engaging in "black hat" techniques. I'm pretty sure the narrator of the story made a "tsk tsk" sound at the end, stressing this as another instance of the evil getting their just desserts. But I think there is a basic disjoint in even using the term "black hat", as it implies a big gap in morality. Black hats aren't going to steal the savings of pensioners, kill kittens or commit other malicious acts. What they do is game the system. In Italy, there's a word for when someone gains advantage through audacity: "furbo". And it is seen as something of a virtue. In many cultures, it is not a bad thing at all to be a "player". How is it, then, that among the experts in my own channel of SEO, black hat is spoken of with such contempt? Mind you, I'm not advocating black hat SEO tactics, as I will explain in a bit. But by framing the black hat discussion in terms of morality, we distort the true meaning behind the concepts.

What happened in black hat

What happened in black hat is the gaming, or attempted gaming, of a system. And in this case, the system is usually Google. Now, we hope Google wants to present the best search results to its users. I'm not always sure of that; the reality may be that presenting the best results does not always help Google make the most money. If we are going to discuss ethics, there is a whole topic for discussion in that alone. In a way, our path as professional SEOs is to get Google to see our site as more relevant to a topic than our competitors' pages. And who is to say it isn't? Google, as we know, has an algorithm, albeit a secret algorithm, and we can all find countless examples where Google fails to provide the best results. So, we might ask, is Google even a good arbiter of the best results?

Let's look at one black hat tactic I heard discussed recently at SMX East: taking an old, well-indexed site and loading it with backlinks to our website. Those links were not part of the original content on the site, and are perhaps not even relevant to the content of that old, well-indexed site. So essentially, black hat games and cheats Google's system of sorting, in a small way. There are some interesting studies (http://blog.ted.com/2009/03/13/dan_ariely_offe/) showing that people cheat MOST in small, incremental ways, when they believe it may go unnoticed. It's just a little trick. In game theory, this is an aspect of the "tragedy of the commons": people will take a little from the group if their actions benefit them a lot but hurt the group only a little. The problem, of course, is that all those little hurts to the group add up. Now, isn't it true that highlighting a key phrase in meta tags, H1s, links, etc. is ALSO gaming the system? Maybe it's playing the system less than purchasing an old site for its links, but when we talk about ethics, can we really talk about degrees of ethics, where small measures get a pass?

Mrs. Google

Imagine a classroom where the teacher at the front is Mrs. Google and all the students are SEOs. Mrs. Google asks, "Which student here can answer this relevant question?" And we all raise our hands; each one of us wants to be picked. ME! ME! Notice me! And sometimes, to get noticed, perhaps we bounce up and down in our seats to influence her choice. But should the student who bounces up and down the most in their seat be the one chosen? Are they really the most relevant? Well, maybe one student not only bobs up and down in his seat, but makes a little chirping sound! Is the chirper a black hat? Isn't he just using a method of getting called on that has nothing to do with the fairness of being chosen?

So, what I suggest is that black hats are simply perceived as such by others, the white hats, and that compared to most SEO professionals they are only doing what everyone does, only to a greater degree.

Beyond good and evil

As an SEO, I discourage the use of so-called black hat techniques. Simply put, they can result in a website being perceived as deserving a Google purgatory. It would be irresponsible to put a site at risk this way. But that is a risk management issue, not a moral issue. The SERP as we knew it has been dying a long and painful death. Every so often a new extension of the Google SERP makes static relevance matter less: Instant search, local results, etc. In other words, DYNAMIC results based on the searcher's demographics or behavior. Cool. And by focusing more on creating content that is rich in relevant clouds of words and phrases, we not only want to be perceived as more relevant, we focus on being more relevant. That takes the entire discussion outside of black hat/white hat.

About Ric Dragon


Sunday, January 30, 2011

Why companies made major marketing shift in 2010 (free white paper)

SEO.com releases a white paper outlining a radical change in marketing expenditure across the country and identifying the return on investment of search engine marketing.


SALT LAKE CITY - In this evolving digital age, marketing has undergone a significant change. Organizations of all sizes are shifting advertising and marketing budgets from traditional media to search engine optimization and other forms of online marketing.


"If we look at the numbers out there, it is very revolutionary," said Nelson James, president of SEO.com. "What used to be the most important strategy for marketers has taken a back seat."


Forrester Research said marketers spent $26 billion on internet marketing in 2010, rivaling all expenditure on cable/satellite TV and radio. The Search Engine Marketing Professional Organization (SEMPO) said that nearly half of all companies cut expenditure in traditional areas to invest more in online marketing.


"We invested in SEO this year, and the payoff was enormous," said Sarah Huizingh, marketing manager for Spillman Technologies, a company specializing in public safety software. Huizingh said Spillman moved money from its print advertising budget to invest in SEO.


The traditional strategies taking the biggest hit include print and direct mail. A recent SEMPO survey reported that 49 percent of companies transferred money from their print advertising budgets toward search engine optimization services, pay-per-click management and social media marketing. In 2010, 36 percent pulled money away from their direct mail budgets, and 17 to 24 percent of companies made a similar shift away from conferences and exhibitions, yellow pages, and TV and radio spots.


So what are the reasons for this change in behavior?


"Really, three things matter," said James. "First of all, in a bad economy people look for marketing solutions that target their demographic better, are highly measurable, and show how every marketing dollar spent is making money."


Strategies such as SEO and PPC reach customers at the moment they want to buy. Social media has the potential to engage millions of customers. With analytics, marketers can accurately track where customers come from, how long they remain on a site, which campaigns bring in the most sales, and more.


The average SEO.com client that has done search engine marketing for six months or more received an average return on investment of almost 2,500 percent, the equivalent of receiving $25 for each $1 spent.
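A quick check of the arithmetic behind that claim. Counted as gross return, $25 back per $1 spent is 2,500 percent; counted as net ROI (gain minus cost, over cost), it is 2,400 percent, which is presumably why the press release says "almost 2,500 percent".

```python
# Sanity-check the press release's figure: $25 returned per $1 spent.
spent = 1.00
returned = 25.00

gross_return_pct = returned / spent * 100        # gross return multiple
net_roi_pct = (returned - spent) / spent * 100   # conventional net ROI

print(gross_return_pct, net_roi_pct)
```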


"Online marketing allows companies to track every penny spent and is proven to deliver a really great ROI," James said. "It is probably the main reason why more are moving their marketing budgets online. As a result, ever less of the marketing budget remains for traditional advertising."


For more information, images, graphics and statistics about this shift in the marketing world, and the average return of 2,500 percent, read the white paper "Shift: From Traditional to Online Search Marketing" here: http://www.seo.com/Shift_Whitepaper.pdf


About SEO.com
SEO.com is an SEO firm that provides a large return for its customers by driving traffic to their sites through aggressive search engine optimization, pay-per-click advertising, and social media marketing. SEO.com turns visitors into sales with user friendly design and conversion optimization. Customers range from small start-ups to Fortune 100 companies.




Content marketing optimization session with Lee Odden - PubCon 2010

 If content is searchable, it can be optimized.

What are your customers' content preferences? How do they discover content? Consume it? Share it? Create a profile for your target audiences.


Tools for creating data-driven personas

Demographic information from Quantcast, Compete
Keyword info from SEMRush, Google
Engagement info from PostRank
Social network info from Flowtown, Rapleaf

Create an editorial spreadsheet to schedule all content, including:

Topic
Keywords
Media type
Places to repost/repurpose content (newsletters, SlideShare)
Places to promote (Facebook, Twitter, etc.)

The SEO content cycle

Create and promote optimized content
Content gets noticed and shared, and visibility grows
Exposure attracts more subscribers, fans, friends and links
Increased links and exposure grow search & referral traffic
Traffic & community provide data that can inform further content development and SEO, and social networks continue to grow

Repurposing content example

Upload the video to YouTube
Embed it in a blog post, with screenshots, images and text telling the story
Turn it into a PowerPoint or PDF
Upload the video to Flickr
Upload documents to Docstoc, Scribd, etc.

Take away

Develop and optimize content with customer personas in mind
Think like a publisher and create an editorial plan
Develop distribution channels & leverage social links
Analyze both web and social media


Google's missing disclosure

One of the fundamental keys to monetizing third party content is finding a way to do it while keeping your earnings data abstract. A huge problem that hits pure plays like Netflix is that as soon as companies see the profits the cost structures change.

Comcast is looking to get some funds from Level 3 (for distribution of Netflix content), and partners who license video content to Netflix want a bigger piece of the action as well: "Now many of the companies that make the shows and movies that Netflix delivers to mailboxes, computer screens and televisions — companies whose stocks have not enjoyed the same frothy rise, and whose chief executives have not won the same accolades — are pushing back, arguing that the company is overhyped, and vowing to charge much more to license their content."

Making big money on someone else's content makes the content owner look stupid. As soon as you let big media know you are making money on their content they get pissed and feel they rightfully earned that money. As they sense a shift in power any edge cases become the standard against which all other deals are compared.


If you study Google & listen to their quarterly conference calls you will always come away with the following: YouTube is operating at an amazing scale, Youtube's growth is accelerating, and YouTube might not be profitable. In the most recent quarterly call Google highlighted that their display network was a $2.5 billion business, but we never hear specific revenue or cost stats from YouTube. Hiding that business within the larger Google enterprise allows Google to print money and gain leverage without evoking the wrath of big media.


Sure there is the Viacom lawsuit, but YouTube streams over 2 billion videos a day with roughly 1 in 7 of those views being monetized. The growth trends keep accelerating, with revenues more than doubling each year, but Google doesn't have to deal with the Netflix issue of margin collapse from partners - because they don't break out profits.
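For scale, the two figures quoted above imply on the order of 285 million monetized views per day; the inputs are from the post, the division is mine.

```python
# Back-of-envelope: 2 billion streams/day, roughly 1 in 7 monetized.
daily_streams = 2_000_000_000
monetized_share = 1 / 7

monetized_per_day = daily_streams * monetized_share
print(f"{monetized_per_day:,.0f} monetized views/day")
```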


Many large scammy/criminal corporations (like the too big to fail bankers & the huge pharma companies) have their 'profits' legislated, even as they destroy the economy. Their political kickbacks to politicians are so strong that in spite of committing multiple repeated felonies, they have enough political sway that third parties create scammy non-profits promoting these brigands to win political favor.


Google claims they are not dominant, but they do not sit in an area where they can legislate their own profits. So they must operate in the gray area elsewhere to sustain and grow their profits.


Have a trademark? Not buying your own brand? Don't worry, a competitor will. Prior Google research (and Google sales material) has shown consumer confusion from some of these activities.


 


But Google has a great legal team & has managed to grow profits by forcing brands to buy their own existing brand equity, even if it adds zero revenue & significant cost for the advertiser.


Remember how Google doesn't like cloaking? But they will DRM-manage your media for you, & if someone views it outside of the appropriate area they will get a "screw you" page, like so:


(If you are from the US you can see how content is cloaked in various countries by using web proxies or VPN services.)


Is Google a more authoritative bookseller than Barnes & Noble? Other than lying & taking a few legal shortcuts, what puts Google in a superior position as a bookseller?


At least their (lack of) respect for copyright is consistent.


Remember back when Google claimed that anyone buying or selling links needed to do it in a way that is both machine readable & human readable? Well, Google invested in VigLink, which is certainly 100% counter to that spirit. Further, consider Google's recent hard coding of ebook promotions in their search results. There is no ad label in a machine readable or human readable format, but they mix it right in their 'organic' search results.


Remember how paid links were bad?


"Search engine guidelines require machine-readable disclosure of paid links in the same way that consumers online and offline appreciate disclosure of paid relationships (for example, a full-page newspaper ad may be headed by the word 'Advertisement')" - Google.


If you do the same thing Google does, then you are violating their guidelines. Sorta hard to compete with them while staying inside their guidelines then, eh?


If Google expects you to label your paid ads in machine and human readable ways, then why are they fine with blending their ads directly into the organic search results with no disclaimer? Do they actually believe that manipulating end users (to promote their own business deals) is less evil than potentially manipulating a search tool?
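For concreteness, the "machine readable" disclosure the guidelines describe is conventionally a rel="nofollow" attribute on the paid link. Here is a stdlib-only sketch that flags anchors lacking it; the sample HTML and the auditing approach are my own illustration, not anything from the post.

```python
from html.parser import HTMLParser

class PaidLinkAudit(HTMLParser):
    """Collect hrefs of <a> tags that carry no rel="nofollow" disclosure."""

    def __init__(self):
        super().__init__()
        self.undisclosed = []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        rel = (attrs.get("rel") or "").lower().split()
        if "nofollow" not in rel:
            self.undisclosed.append(attrs.get("href"))

# Invented sample: one editorial link, one disclosed ad, one undisclosed promo.
sample = (
    '<p><a href="/about">About</a>'
    '<a rel="nofollow" href="http://sponsor.example">Sponsor</a>'
    '<a href="http://ebook.example">Buy the eBook</a></p>'
)
audit = PaidLinkAudit()
audit.feed(sample)
print(audit.undisclosed)
```

Editorial links legitimately lack the attribute too, of course; the point is only that the disclosure, when present, is trivially machine-detectable.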


The absurdity reminds me of a quote from You Are Not a Gadget:



If you want to know what's really going on in a society or ideology, follow the money. If money is flowing to advertising instead of musicians, journalists, and artists, then a society is more concerned with manipulation than truth or beauty. If the content is worthless, then people will start to become empty-headed and contentless.


The combination of hive mind and advertising has resulted in a new kind of social contract. The basic idea of this contract is that authors, journalists, musicians, and artists are encouraged to treat the fruits of their intellects and imaginations as fragments to be given without pay to the hive mind. Reciprocity takes the form of self-promotion. Culture is to become precisely nothing but advertising.



View the original article here

Saturday, January 29, 2011

Google launches MILLIONS of doorway pages

I mentioned this in our last post, but it probably deserves a post of its own. ;)


Google has long claimed that search results within search results are a bad user experience. They also claim that their use of your content is fair use, because it is only used for ranking and distribution purposes.


Take a look at Google's Deskbar subdomain. Google has created millions of pages on this subdomain:
These pages are ranking in search results:
Google's quest to be the web leads to many half-finished products (eHow's content is written at a higher level than Matt Cutts writes) and a growing variety of bugs. This of course creates opportunity for some people, but a whole lot of pain for the many people who have done nothing wrong but trust Google to be competent and fair.


I understand a ready-fire-aim approach to beta testing for start-ups, but should Google be shipping this sort of silliness on a search service used by millions?


So much of Google's originality algorithm depends on determining the true source on the web; the moment bugs like this appear, that trust is clouded, and the people who poured blood, sweat, and tears into a product can be wiped out by the flip of a deskbar.google.com launch.




5 Creative solutions to tough SEO challenges

December has been a surprisingly busy month for my email inbox, with questions on nearly every SEO subject imaginable. In answering many of these quandaries, a common theme emerged - many marketers approach SEO challenges with a singular focus on the most common / best-practice techniques and don't stray into a creative, imaginative mindset to find alternatives.


Here, then, are five examples of problems I've seen where creativity might prevail over standard techniques.


Several SEOs I know are currently involved in high-budget reputation management, where a company, product or person is attempting to assert control of the search results for their name/brand. Most of the standard techniques involve linking to and/or creating positive or neutral content about the target to push down the negative content.


A creative, alternate methodology might be to create diversity by introducing multiple brands/people with the same or similar names. For example, if a Mr. Thomas Thompson is attempting to push down negative results for his name, you might seek to boost the profiles/rankings of other Thomas Thompsons and generate buzz about them to make the engines consider applying diversity algorithms to the results. Similarly, you could create fictional profiles (pseudonyms) or characters for the same effect. Hollywood movies, TV shows, short films, authors and actors can even be persuaded through funding or other means to name characters or products a certain way.


A number of large e-commerce site marketers have experienced considerable challenges getting deep content indexed. The common solutions include optimizing XML Sitemaps, carefully crafting internal navigation and working to drive more links to deep pages, all of which are certainly recommended techniques, but eventually reach a point of exhaustion.


My recommendations are often to try a few alternatives, including:

- Eliminating a large number of pages, particularly faceted forms of navigation (making them accessible only to logged-in or cookied human users and employing rel=canonical), but also products that have very low search volume, no inventory, low margins or frustrating availability. By limiting your product catalog online, you can then achieve full indexation and build upon it.
- Creating product feeds, product category blogs and even category/product Twitter accounts to help send indexing signals to the engines. A blog about each of your main categories featuring posts about a few products each day via something like a Tumblr blog can, with a small amount of editorial effort, enable indexing of a few new URLs each day. Over the course of 12-18 months, this can add substantively to the bottom line and be reproduced. Ditto for Twitter and product feeds, though both will need to provide real value to subscribers/consumers (perhaps "deal of the day" type content) to earn subscribers/followers and show the engines they're not just empty scrapers.
- Rewriting or adding to the written content on a few hundred sample pages that aren't being indexed. I'm frequently seeing that what appears to be a lack of PageRank/link juice to earn indexing is actually a case of "not enough unique content." If the site is seeing regular crawling to pages that don't make their way to the index, this is often a worthwhile exercise.

When you reach the tens or hundreds of thousands of pages and all need to be separately indexed, the resulting need for more "unique content" on each page can seem an overwhelming task. The common approaches are to either hire/contract/find in-house editorial writers or leverage user-generated content to help boost the content uniqueness, but other approaches are also available.

- Human labor using sources like Mechanical Turk or similar services, which I've written about extensively in the past.
- Building content the Google way - by aggregating the popular words, phrases and sentences others use to describe it (with citation of course). As an example, see how Urbanspoon quotes restaurant reviews or Rotten Tomatoes aggregates critics' reviews. You could even add multimedia content with YouTube, Flickr or other sources. Just be aware that editorial content and review is still critical to make sure these pages are adding value rather than just automatically scraping and re-purposing.
- Prioritizing. Many site owners seem to feel that a unique-content project means that every page deserves equal attention, when in fact, it's likely that giving 80% of the effort to 20% of the pages is a much smarter play. Determine the pages that add real value, add your content efforts there, and see the impact before moving on to the long tail.

I see marketers banging their heads and their link building efforts against a wall, trying to outearn a competitor with a strong lead for a particular keyphrase (or a small handful).


Instead of trying to beat them at their own game, why not work around the system?

- Try alternative keywords that could get at the same audience before they're conducting that specific, high-converting search.
- Consider video content on the major platforms and your own site (using the Video XML Sitemaps protocol) to earn video rankings on the same page (which often draw as many clicks/visits as the first few results).
- Create news, blog posts and tweets to help trigger the QDF algorithm and get alternate content types you own in front of searchers and ahead of the first "organic" result.
- Win the social, branding and "mention" battle, which will often turn to links and recommendations over time, eventually earning you top rankings.
- Influence search queries and content on the web through branding, news, social media, content creation, etc. to make Google's Suggest/Instant feature recommend more targeted queries that you own in the rankings.
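For the video option above, a Video XML Sitemap tells the engines where a video and its thumbnail live. Here's a minimal, illustrative entry built with Python's standard library - the element names follow the public sitemap-video/1.1 schema, but every URL and title below is made up for the sketch:

```python
# Sketch of a minimal Video XML Sitemap entry (video extension namespace
# per the sitemaps.org / Google video sitemap schema). All URLs are made up.
import xml.etree.ElementTree as ET

SM = "http://www.sitemaps.org/schemas/sitemap/0.9"
VID = "http://www.google.com/schemas/sitemap-video/1.1"
ET.register_namespace("", SM)
ET.register_namespace("video", VID)

urlset = ET.Element(f"{{{SM}}}urlset")
url = ET.SubElement(urlset, f"{{{SM}}}url")
ET.SubElement(url, f"{{{SM}}}loc").text = "http://www.example.com/videos/widget-demo"

video = ET.SubElement(url, f"{{{VID}}}video")
ET.SubElement(video, f"{{{VID}}}thumbnail_loc").text = "http://www.example.com/thumbs/widget-demo.jpg"
ET.SubElement(video, f"{{{VID}}}title").text = "Widget demo"
ET.SubElement(video, f"{{{VID}}}description").text = "A 2-minute demo of the widget."
ET.SubElement(video, f"{{{VID}}}content_loc").text = "http://www.example.com/video/widget-demo.flv"

xml = ET.tostring(urlset, encoding="unicode")
print(xml)
```

Submit the resulting file through Webmaster Tools like any other sitemap.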

Perhaps the most common complaint I see in the white hat vs. black hat back-and-forth is that white hat link building never earns ideal anchor text. Bollocks!

- Profile and biographical paragraphs are one of the best ways to earn the anchor text you want. My professional/event bio has made its way to dozens of sites, all of whom link back in the manner I've requested. These are 100% editorially given, white hat links, often from powerful media or event sites.
- Press releases that get picked up by news media sites will often carry the link anchor text you've chosen.
- Guest writing / guest blogging for a relevant publication often allows for a link back to your site. If you're creative about the formation of that link, you can ensure the anchor text is ideal (or close to ideal).
- Widgets, badges and embeddable content have fully controllable link anchor text at the time of production - so long as you're not manipulative or appear spammy, the links will point back in the way you've chosen/created.
- Titling products, pages, posts and essays with the keywords you're seeking means that those who reference the work will be much more likely to use those terms/phrases in the links others create.
- Requiring specific anchor text via citation when giving away or licensing content is another way to ensure you're building optimal link text.
- Finding friends, family, employees, co-workers, etc. who link to you and reaching out directly to have anchor text modified can result in substantive quantities of optimized anchors.

And, for posterity, I'm going out on a limb here and predicting that exact match anchor text for commercial terms is likely to get considerably increased scrutiny in the next year from Google (see my prior post on how this might be done).


It's true that many times, the basic best practices are the right way to start, and may even be the right solution. However, more organic marketers need to be thinking outside the box, as classic SEO becomes more competitive and dominated by entrenched players.


Please feel free to share your own creative solutions in the comments!



Ask and thou shalt receive

If Google is smarter than humans, we must accept that it should be able to help us answer the difficult questions about life that are vital toward making humans reach their full potential, such that we may help computers become smarter, so that we may reach the singularity.


Sure some folks who took some funding are trying hard to build real communities around niches, but they are doing it all wrong.


The folks who are doing it right seem to have the answer to everything. Millions and millions of answers. The modern day Matthew Lesko of the search world.


Ask has long played the search arbitrage game, but they are stepping up their game.


Every authoritative site should have an answers subdomain.


Every site is an opportunity for more answers.


Why shut a site down, when you can just throw up some scraped & autogenerated pages and wrap them in a Google ad feed that pays out over 80%?


Even if you have redirected a site as a defunct relic for a decade, once you have your auto-generated content in place you can simply throw the domain in the hopper and generate a few million pages.


Why did Ask fold their search engine & focus on Q & A? They claim the following:



"The development of search as a technology has become commoditized. To continue to invest our own resources to do web search doesn't make sense because that development is expensive and doesn't give you a differentiated product," Ask President Doug Leeds said by telephone.


My contention is that there is no value in spending the engineering resources to fight auto-generated spam if Google is paying you to create it. At some point one stack of money becomes much larger than the other.


Then again, speaking of differentiation, I wonder if Doug Leeds would care to comment on whether answers content "has been commoditized" at all by them skirting around the intent of fair use laws (much like YouTube did with video content). Are they offering a "differentiated" service by turning tons of their sites into giant answer farms?


Ultimately this is much like Mahalo, but on a grand scale. At least they are not pointing expired redirects into their site (like eHow did) but if this trend continues look for thin answer sites wrapped in AdSense to become the equivalent of the auto-generated affiliate feed powered website of years gone by. The model is infinitely more scalable than content mills since the companies doing it don't actually have content costs: throw a keyword list in the hopper, send your scrapers out to "add value" & watch the money come in. Wherever something is working simply throw more related keywords in the hopper.


The lack of cost to the model means you can build thousands of pages around misspellings and yet still have it be profitable...the cost of creating each page is under a cent.


Who funds the creation of all this garbage? Google, via their AdSense program. It's a bit of Southern Hospitality from Google, if you will.


Do you own a forum or answers website & are sick of seeing Ask outrank you by leveraging their domain authority + "fair use" of your content? Here is how to block their bot in robots.txt:



User-agent: Teoma
Disallow: /
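If you want to double-check a rule like this before deploying it, Python's standard library ships a robots.txt parser. A quick sanity check (my example, not from the original post) confirms the rule blocks Ask's crawler, which identifies itself as "Teoma", without touching anyone else:

```python
# Verify that a robots.txt like the one above blocks Ask's "Teoma" bot
# while leaving other crawlers (e.g. Googlebot) unaffected.
import urllib.robotparser

rules = """\
User-agent: Teoma
Disallow: /
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(rules.splitlines())

print(rp.can_fetch("Teoma", "http://www.example.com/any-page"))      # False
print(rp.can_fetch("Googlebot", "http://www.example.com/any-page"))  # True
```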


Google has the ability to warp markets as they see fit, be it ad exchanges, tax policy, copyright, trademark, or hard coding the search results for self-promotion. While reading Gmail a couple minutes ago I saw the following ad, which I think pretty much sums up Google's approach to search: monetize everything!


With great power comes great responsibility; however, working on the Google spam team while watching this stuff unfold must feel a bit like the movie Brazil.


Remember how all kinds of affiliates were given the boot by Google for not "adding value"? How are lander pages like this one adding any value? 10 of 10 above the fold links are monetized. And it looks like their sites are using content spinning too!


The promise of the web was that it could directly connect supply and demand to make markets more efficient, and yet leading search engines are paying to create a layer of (g)arbitrage that lowers the utility and value of the network for everyone else, while pushing even more publishers into bankruptcy as the leeches grow in size & number.


My guess is that unless this short term opportunism changes, some of the star search engineers will leave in disgust within 12 to 18 months. Mark 2012 on your calendars, it will be a good year for clean search plays like Blekko and DuckDuckGo. ;)




Now blogger outreach campaigns are spam too :)

Note that infographics were largely destroyed as a link building tool long ago by some unscrupulous people whose offers were to pay people to host infographics on their sites - in a sense, the word infographic now seems & feels spammy, just like paid links. :D

After killing off that source of links, the people behind this work moved on to fake charities and awards for bloggers: starting multiple fake charity sites over the last few months and then mass-emailing bloggers with fake awards and link purchase offers, together with a touch of comment spamming. I mean, it isn't like they left that many comments. :)

A website with only 4 pages of content that claims to be a non-profit yet still has no contact information, while claiming to be 3 or 4 years old (even though the domain name was only registered 2 weeks ago), is probably somewhat sketchy.

Hello
I would love to buy a text link ad on one of your pages as _

The link would be a finance or MBA Web site.

My budget is $100 and I can pay by PayPal. I can set up a call or
send more details by email - let me know.
--
Maggie Sands

As marketers, everything you do needs to pass the sniff test. If your stuff looks anything like this fake, scammy crap, then you are not going to see much success with it. Above all, because the people who send millions of these emails sterilize the market and make it cynical toward even more marketing techniques, you need to make your web marketing efforts that much more personalized. It also helps to have a real presence in the field; that way bloggers won't see you as just another scam like the people promoting the fake charity angle.

This is another reason why it helps to create things in completely unique formats. The scammers burn out one opportunity after another, but most of their "new and creative" angles are simple extensions of things that worked for others. Getting out ahead of the scammers on a new trend or format is far more profitable than following in their tracks. But note the curve you are on: an idea that starts out pure will, if successful and profitable, come under the eye of the scammers at a certain point. And since most scams are built around trying to look like the real thing, eventually a format that was once profitable loses its power and you must move on.

The best forms of marketing - the ones that help you distinguish yourself from fraudsters - are those that build confidence over time: branding, awareness, social interaction, etc. A person running a puppet might be able to compete with you here and there from time to time, but if you build something with depth and substance they will have a hard time competing with it on a sustainable basis.



Friday, January 28, 2011

What makes a spammer?

Search engines are powerful because they are an editorial filter that encourages relevancy.


The message often marketed to us is that any errors or omissions by search engines are not due to bad algorithms, but rather to unscrupulous spammers.


Webmaster guidelines are arbitrary, ever-shifting, and preached as gospel. The "or else" is a key component of that thinking - fear is part of the algorithm.


And even when some of the largest and most outrageous guideline violations come to light, they are quickly dismissed & swept under the carpet.


In some cases search engineers conflate SEOs with hackers doing illegal activities, but if all marketers and advertisers were criminals, Google.com would top that list, given that ~99% of their revenues come from showing ads and fewer than 100 countries have a GDP larger than Google's revenue. :D


A further claim against spammers is irrelevancy. That was true before search got good (and in some edge cases may be true today), but most spammers try to be relevant. Back in the late 1990s, when banner advertising ruled the web, all it took to profit was page views. But as marketing has become more precisely and accurately measured, it has become more relevant. With most current online marketing driven by true conversion performance, relevancy is key. If you show up where you are not relevant, you are just wasting your time and money.


Search engines have a CPM higher than virtually any other type of media format precisely because their ads are so relevant.


We will skip over the fact that the Google ad system is set up to maximize Google's return on investment, and that Google AdSense has a get-rich-quick ad category. Overlooking those, the core argument against spammers is that they pollute the organic search results and use Google's distribution to bring an inferior product to market.


Do you know who else does?


Google.



Yelp Inc. CEO Jeremy Stoppelman has complained about Google's use of Yelp content for Google Place pages and has negotiated the problem with Google. He said Google is using its distribution - the search engine - to "take an inferior product and put it in front of the user."


According to the above WSJ article, Google searches sending traffic to TripAdvisor are down over 10% since Google Place pages won prominent placement in the search results. Not only does Google borrow 3rd party content and then use it to displace the source, but they also don't pay the 3rd parties they do that to.


What is the real reason Google hates spammers?


Competition.


In Google's ideal world you would build a media empire by scraping whoever's content you want, earning money from it however you like, and paying your partner a prescribed proportion of the revenue - right until Google finds another partner willing to accept less.


It is true with text, with communities, with news, with images, with video, and soon with eBooks.


Google has created millions of doorway pages (and many of them are ranking in Google, even though users don't like "search results in the search results").


It reminds me a lot of Richard Nixon: "when Google does it, that means it is not spam." Indeed!


Be careful out there; the web is a spammy place. ;)




SEOs should focus on where Google is heading

Interesting small excerpt from Mr. Cutts:


Matt recommends SEOs not "chase the algorithm" and instead try to predict where Google will be in the future. Matt was addressing PubCon.


Good advice, methinks.


Trying to predict where Google is going is something we do a lot of at SEOBook.com. While nobody has a crystal ball, it is good practice to keep an eye on the search horizon.


So, where do we think Google could be positioning itself?


Simple, huh?


Their biggest competitors appear clueless when it comes to search. Bing may make some progress. Perhaps. It is hard to imagine anyone eating Google's lunch, when it comes to search, for many years to come.


Is Facebook a threat? I doubt it. Search is difficult, and I see no reason why Facebook - which has a media focus - could own the search channel any more than Yahoo could.


Search is, after all, an infrastructure problem. Google's infrastructure would be very difficult to replicate.


A search result set only really contains spam when Google's users think it contains spam, i.e. they do not see the answer they were expecting.


The fact that a web site can fall outside of Google's guidelines may get competing webmasters' knickers in a knot, but it probably doesn't matter that much to Google, or anyone else.


Although Google will devote more resources to it, I suspect Google's efforts will, as Matt Cutts says, stay focused largely on outright deception - i.e. hijacking and malware.


We can forget the San Fran techno-hippie ethos of the web. It will not be a scrappy democracy, if it ever was. History shows us that those who gain power try to centralize control to keep it.


Google will try to keep users on Google for longer. They do so by owning more and more verticals, and by extracting and reformatting data. When they do send visitors away from Google, they will try to do so more and more on their own terms. Carefully observe what kind of sites Google rewards, in contrast to what they say they reward.


Expect less competition in the market as a result. Some people are angry about it.


Google follows users. So does Facebook. Wherever your users are, you will need to be there too. On Google Maps. On YouTube. Wherever and whenever. Think beyond your site. Think about getting your data out there.


Rich Skrenta noted in a recent interview:



Social media can drive tons of attention, awareness and traffic. But the search box is still the best way to navigate - the question now is what results it should drive to. If I type in "pizza," what should I get? The answer can vary greatly depending on whether the results come from the web, Yelp, or Facebook. I think my answer is that I still see search as the core way to navigate, but I think what it returns is going to get much more structured and move away from simple keyword matches against unstructured web pages.


Microsoft Research has found that people tend to organize their memories geographically, i.e. by where they were when something happened.


If you want to know where Google is positioning itself, then watch Marissa Mayer. Marissa was responsible for much of how what you look at on Google is organized. Marissa just moved to head up geographic and location services.


Google Earth. Google Maps. Google Local. Google Street View. Mobile site data and targeting. Expect more data to be organized around place.


Aaron talked about TechCrunch's tendency to over-hype new developments:


"...but this changes everything..."


SEO has not changed much over the years. We still find an audience (keyword research), we publish, we build links to the content, and we repeat it over and over again.


The changes come at the edges. Radical change carries a lot of risk, especially for major companies like Google. Shareholders may not like risk. Why break something that earns so much and is so popular?


The biggest changes in the way we do things on the web will probably come from the upstarts. They are probably hard at work in their garage right now.




Search power plays, and how to avoid being crushed

The little guy often loses.


As a niche saturates, the winners are typically those with the deepest pockets.


Until the last few years, the little guy was able to thrive with SEO. The little guy didn't face much competition from large companies, because the large companies hadn't yet gotten SEO. Google's current algorithms and corporate strategy often benefit big business as a side effect.



According to Google CEO Eric Schmidt, the Internet is a "cesspool" where incorrect information thrives... Brands, he said, are the way to rise out of the cesspool.


There is a danger in reading too much into Schmidt's words, but this statement reflects much of what is happening in the search results. A large company or brand with a web site can simply dominate the searches it targets. A big company will have links, will be discussed in the media, and will be rewarded for its query volume - all factors Google looks at. All these factors are more difficult for the little guy to emulate.


Factor in Google's current moves to "own" industries, and many more small guys will be crushed beneath their feet. It does not matter if your site is white hat, grey, or black: if your site competes directly with a large company, or with Google - which is now a big company itself - you almost certainly lose.


This is not only true in the SERPs, of course. It is also true in AdWords, which essentially rewards those with deep pockets. It is true in print. It is true in all media. It is true in politics, in money markets, and in life.


Power is.


Even if you do not face competition from large operators, you will face competition from a million other small guys, especially if there is no barrier to entry. This is often the case on the web. Check out this article by Tim O'Shea, founder of the short-lived UK group buying website Snippa. Snippa was similar to GroupOn.



Due to the sheer number of players, commission levels have eroded far from the 40-50% GroupOn commands, down toward 0% (at Snippa our offers averaged about 10-20%, and we always had many phone calls from potential group buying companies). Conversations with many merchants were driven more by the commission level than by how a great discount could win a group of new customers. This continues until one clear market leader can prove a large customer base, enabling it to negotiate better offers and commission levels. Many companies chasing the same deals is counterproductive for the end customer.


Too many competitors erode margins to zero. In the end, the largest operator wins.


When looking to enter a niche in search, how would you evaluate it?


Look at the search volume and ask whether a top-ten position could capture that volume? An OK strategy, and one used by many in the SEO business.


However, you can go a step further.


If you think long term, you must consider other factors, especially competitive threats. Ask: is this niche likely to be so lucrative that it will attract large enterprises? If so, then your strategy may be to enter early and be bought out. You can win such a struggle for a while, but the large companies will inevitably win on reach and purchasing power.


Are you the cheapest, or are you the best?


Select one.


The little guy is almost always better off aiming to be the best at what they do. Being the cheapest requires volume and is very difficult to maintain. Many companies, both large and small, get locked into a downward spiral of price cutting. Again, you are only the cheapest until a larger company emerges. Larger companies can get better pricing through volume. If the Internet equivalent of Wal-Mart is your competition, you are in trouble if you compete on price.


Zappos was a small company that eventually became a big company by competing not on price, but on service. They aimed at being the best at service. Had they competed on price, the established shoe and clothing chains would have crushed them.


Is SEO your only strategy to dominate a niche? If so, you are vulnerable to the whims of Google. Instead, think of ways you can develop a brand. I use the term brand in the broadest sense: being the person to talk to about, say, the food preferences of the neon tetra fish is a brand. Whatever it is you do, if you are not competing on price, aim to be the best at it - carve the niche even finer if you have to, at least until the costs outweigh the benefits.


Think of ways you can lock in customers/visitors and keep them coming back. If you rely only on search volume, people will leave before you've even seen them. Encourage visitors to bookmark, or to sign up for a newsletter. Hook them in any way you can. Above all, be memorable. Being memorable creates search volume from nothing (how many people searched for Zappos years ago? Or SEOBook?). Building an audience might not be enough to block large companies, but it helps fend off other small companies and new entrants, especially those who trust only in SEO.


Be the big guy in the small niche :)




The SEOmoz internal SEO pre launch checklist - Whiteboard Friday

 As we all know, SEO is a very labor-intensive job. It can be really easy to let some things fall by the wayside even if you know you're supposed to do them but don't have the time. It gets even more complicated when you forget to do them in the first place! Now, Danny is an awesome man of many strengths, but he can be a bit of a forgetful grandmother at times - even he knows that it is important to write down processes so they're easier to replicate in the future. As an early holiday gift to us all, he has decided to spill the optimization beans and share his SEO checklist with the community he loves. Get the details in today's Whiteboard Friday video, and the more general checklist in the post below. Please feel free to share what's on your checklist in the comments below!


Hello everybody. My name is Danny Dover. For today's Whiteboard Friday I'm going to show you something very, very special that we use internally -- the SEO cheat sheet for launching a website. We couldn't really think of a good name for it, but that's not the important part. The important part is the content that's on it.

Let me explain where this came from. We have a VP of Product here, his name is Adam Feldstein. A lot of times he'll come to me and there's a page that just needs to go up in the next two hours. Sometimes it's a landing page of some kind. Sometimes it's a marketing page of some kind, and SEO comes in right at the end. Usually what happens is I look at Adam like, "Well, Adam, you know you shouldn't do that." And Adam's like, "Danny, why are you talking like your grandma? I don't understand that." The idea is that SEO, of course, is supposed to be within the entire process, the entire product process. You work it in every step of the way. To be fair, at SEOmoz I think we do a good job of that, but sometimes we get these side products where it just has to happen at the end. In those cases, this is the cheat sheet that I go through to make sure everything gets covered.


So, I've very deliberately broken this into two category groups. I have critical on my right. And I have not quite as critical. So, I'm not saying that they're not critical. I'm just saying that they're not quite as critical. So they're all important. The things on my right are the most important things to cover. So let me cover these one by one.


The first one is targeting. I can't emphasize how important this is. A lot of times when I get mockups back, there will be many, many ideas expressed on a single page. Sometimes they'll relate to each other and sometimes they won't. The idea here is in a perfect SEO world, which we don't live in, but if we did, there would be one idea per page. The reason I say that is because from a search engine perspective if there is one idea per page, it is very, very easy to figure out what that page is relevant for. A lot of times when I get these mockups, part of the process is figuring out what is the bigger idea that is trying to be expressed and then targeting that phrase. So, if it is Justin Bieber, if we're working on his home page for example, I don't know why we'd do that. But, if we did, it would be just him as a celebrity or his personal reputation. The concept of Justin Bieber, that is what the homepage of him needs to express. That would be what we would target with all these different things. I'll cover all of these. Once you figure out what you're trying to target, then you need to go through and target them in the correct ways.


The first thing that comes up is the content. We talk about this a lot in SEO. Content is king. If I had a dollar for every time I said that, I'd have a lot of money. Content is extremely important. The content that is on the page is why search engines are indexing it in the first place. They act as middlemen, or middle machines as it were. Real human beings go onto Google and they search with real queries, and they are hoping to find content that is written for them about whatever it is they're searching for. So the content needs to be about whatever you're targeting.


Underneath the content is the title tag. We've done a lot of research on this at SEOmoz. We've found that title tags are extremely important when it comes to on-page ranking factors. In fact, they're the most important on-page ranking factor from an SEO perspective. With title tags, what you want to do is have the keyword phrase at the beginning of the title tag. I'll link to a post below that explains all the intricacies and all the subtleties of title tags so you can get a good idea.
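As a quick sketch (the site and phrase here are invented for illustration, not taken from any real page), a title tag targeting "real estate photography" would lead with that phrase:

```html
<head>
  <!-- Target phrase placed at the beginning of the title tag -->
  <title>Real Estate Photography Tips | Example Photo Blog</title>
</head>
```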


After I check over the title tags, I look at the URL. Ideally, I want the URL to be as short as possible and as semantically clean as possible. Does it make sense? Something like a domain, a category, a subcategory, and then the content. Does it express an idea that makes a lot of sense? So if it's something like SEOmoz.org/content/blogpost/whatever-the-blog-post-name-is, that would be a clean URL. Just for example.


Under the URL, I have meta descriptions. Meta descriptions don't help you at all from a ranking perspective, but they help you with click-through on the search engine result pages. This is kind of a way of getting a free ad in the search results. I always make sure that we have one of these and that it includes, again, the target phrase that we picked earlier. If it is in there and there is an exact match or a near match, the search engines will bold it, which helps with click-through rates.
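For example (the description text below is made up for illustration), the tag itself looks like this and should include the target phrase:

```html
<!-- No ranking value, but acts as a free ad in the search results -->
<meta name="description" content="Real estate photography tips and tutorials for agents who want better listing photos.">
```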


Underneath that, I have rel=canonical. This one has come up a lot more in the last three months or so than it has before. This is very important. This is a resource the search engines have provided for us so that we can tell them what the canonical version of a web page is. You'll see this a lot where tracking parameters mess it up, or a trailing slash versus a non-trailing slash on a URL will mess this up. It is very important that you include this. Actually, at SEOmoz, we're trying to push an initiative through dev to have this on every single one of our pages. That's something that I think will come in early Q1 2011, making sure that we're making it very clear which is the canonical version of every single page on our website so that link juice doesn't get distributed unevenly.
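As a sketch (the URL is hypothetical), the tag goes in the head of every variant of the page and points at the one true version:

```html
<!-- /page, /page/, and /page/?utm_source=... should all carry this same tag -->
<link rel="canonical" href="http://www.example.com/page/">
```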


Underneath rel=canonical we have alt text. Alt text is a textual representation of an image that you can optionally apply. This is helpful for search engines and for human beings as well. So, human beings that need screen readers or that are using some other alternative method of consuming web pages, alt text comes in handy for them. We have found this, along with title tags, to be oddly highly correlated with rankings. With alt text, again, you're going to want to use the target phrase you chose in the first step.
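Using the Justin Bieber example from earlier (the filename is invented), alt text on an image would look like:

```html
<img src="/images/justin-bieber-concert.jpg" alt="Justin Bieber performing live in concert">
```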


The last one is internal linking. This is the easiest one to screw up. When you have a new page that is about to launch, it is important to look at all of the things that are on the page itself, but it is also important to look at what pages are linking to it. Guess what? Again, you want those anchor texts to be using that targeted phrase that you came up with in the beginning. If it's the Justin Bieber thing again, you're going to make sure that all of those pages on the rest of your domain are linking to it with the correct anchor text, which is whatever you targeted.


That's the critical stuff. That's what I look through in the first five minutes of going through a new page that is about to launch. The other things that I go through have a lot more subtleties to them. So I've put them into a different category. Sometimes you want them and sometimes you don't, depending on what the situation is. I'll go through each of those independently.


Meta robots. Meta robots is a tool that search engines have provided for us that allows you to make a page either indexable or non-indexable. It's sort of like robots.txt, although it is a much cleaner implementation. Let me explain that. Meta robots gives you the option, like I said, of being indexable or non-indexable. It also gives you the option of having all the links on a page followed or nofollowed. So whereas with robots.txt you can say, "Google, don't crawl this page," and it won't, with meta robots you can say, "Google, crawl this page. Don't index it, but have all the links on it pass juice." That way any page you link to, whether it's outward from your domain or a different page on your domain, can still be indexed and will still get juice. So, I never actually recommend using robots.txt unless you absolutely have to, because the benefits of using meta robots just outweigh it completely. Again, I'll link to this and explain it more fully.
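The scenario described above, where the page stays out of the index but its links still pass juice, is expressed with one tag in the page's head:

```html
<!-- Unlike a robots.txt block, the page is still crawled and its links still count -->
<meta name="robots" content="noindex, follow">
```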


Underneath that, I have meta keywords. Meta keywords are an older SEO tag. In fact, I think it's almost completely useless at this point. This year it came out that Google was not using it and Bing was not using it. Yahoo said they were not using it either, although Danny Sullivan proved that they actually were. It doesn't really matter now that Yahoo is using it, though, because all Yahoo search results are getting transferred over to Bing search results. Meta keywords are not going to help you. It's one of those things where not only will it not help you, it could also potentially hurt you. You're giving your competitors valuable information by providing meta keywords. You're saying these are the keywords that are important for me to target on these pages. You don't want to do that. You're spending time to help out your competitors, which is something you want to avoid at all times. The best argument I've heard for them is that meta keywords can sometimes help you on some social sites. I really haven't seen it on any of the big social sites. On some very niche sites, meta keywords can help you. Really, the bottom line for me is that it is not worth your time to go through and add them on each of your pages. So, I never recommend doing that.


H1. H1 is one I go back and forth with a lot with different SEOs. The idea behind an H1 is that you are using HTML headers to explain to the search engines and to different protocols how information relates to other information on the page. So, with H1s we found from a pure ranking perspective they actually don't help you very much. We think this is because they've been abused a lot in the past. But the problem with just going out and saying that and making that the best practice is that H1s are actually very helpful for users. If you go to a blog, it makes a lot of sense that the H1 will be the title of the blog post. This is what the entire page is about, so this will be the H1. I totally agree with that. That is how it should be. From a strict rankings perspective, H1 is probably not going to help you very much. Maybe not even at all. But for users it helps a lot. I recommend you include them, but don't put a huge amount of emphasis on them.


Underneath that is cloaking. Cloaking is something that usually comes up by accident, although some mischievous people do it on purpose. Cloaking is showing one thing to search engines and something completely different to normal users. This comes up a lot on our website when we have one version that is being shown to logged-in users and one version that is being shown to non-logged-in users. A search engine cannot log in. It doesn't have credentials and it can't operate that way. So it is very important to figure out exactly what the search engines are going to see and make sure that logged-out users are also seeing the same thing. If you're ever in a situation where you are serving something different based on user agent, say for Googlebot, you probably don't want to be doing that. Try to avoid it in most cases. There are some very limited hyperlocal exceptions to that, but feel free to ask questions in the comments if you want me to expand on that at all.


Capitalization. So, I'm actually going to group capitalization together with trailing slashes. These both concern subtleties of URLs. Capitalization is whether you include capital letters in a URL, and a trailing slash is when you have something like www.SEOmoz.org/WhiteboardFriday/. If you have that trailing slash, that page will render, at least on our servers, alongside the page without the trailing slash. This is a mistake. This is something that we are going through and fixing. I think a lot of people make this mistake, actually. The problem is that you're creating duplicate content. The same thing can happen with capitalization. If the URL has some capitals in it and the same page also renders at a version that does not have the capitals, you are going to have duplicate content. These are two things I take a look at when I'm going through a web page that needs to go out into production in the next hour or so. I make sure it does not have these two problems.
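One common way to enforce a single version, assuming an Apache server with mod_rewrite available (the rules below are a sketch, not our actual configuration), is a 301 in .htaccess:

```apache
RewriteEngine On

# Strip a trailing slash from any URL that isn't a real directory
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule ^(.+)/$ /$1 [R=301,L]
```

Lowercasing mixed-capitalization URLs needs a RewriteMap defined at the server-config level, since .htaccess alone can't lowercase paths.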


If you have any more questions, you want to expand on this at all, or you have anything I've forgotten, please comment in the comments below. I appreciate your time. I'll see you next week on Whiteboard Friday. Thank you.


If you have any tips or tricks that you've learned along the way, we'd love to hear about it in the comments below. Post your comment and be heard!


View the original article here

Thursday, January 27, 2011

Investigating and prosecuting duplicate content problems

A recent post by Paddy Moogan from Distilled about when to use a 301 redirect and when to use rel=canonical got me thinking about all the possible ways we can fight duplicate content issues.


First, for those who are new to search marketing: a duplicate content penalty is a consequence the search engines impose when they find large amounts of text that have been copied from other sources on the Web. Some would argue that the search engines are simply filtering you out of the SERPs (search engine results pages) in an effort to deliver more relevant, fresh content. Any way you look at it, you won't benefit from it, and therefore it's a penalty in my eyes.


Duplicate homepages can be seen as individual pages, possibly discounting the merit that your true homepage has earned. If your site homepage can be viewed like the examples below, you may want to continue reading to correct the error.


http://www.example.com or http://example.com are both fine, but it needs to be one or the other.
http://www.example.com/index, /home, or /homepage needs to be corrected.


There is also the possibility that someone has outright stolen your content. If the content you created has already been crawled and established itself in Google's index, odds are that thief isn't going to benefit in the search engines. Ideally they'll just get filtered out.


Creating dozens of versions of the same article to distribute to article sites/networks is a rather popular link building technique. While I won't take a stance on its effectiveness, if you use an article that is already on your site and create numerous versions of it, it can come back to bite you because the search engines can still see the correlation between the original and the copies spread all over the Web. It's quite possible it could even discount those included links further.


Some shopping cart content management systems can have different paths to get to the same product or category page. Why is this an issue? Well if those two different URLs are going to the same product, then it's fair to say that those are duplicate pages.


However, if you have a blog and you're worried about duplicate content because the same post appears in multiple categories, the search engines are wise to this and understand blogs. So, the more posts you get in those categories, the more the content mixes up, preventing any sort of duplicate content problem. Same story with post snippets.


One way is to browse your site to see if you have any of the above examples. Another is to type your URL into Copyscape. Keep in mind that when you do this, it only shows you results for the exact page that you entered, not sitewide. So it will not return duplicate content results for pages other than the URL you submitted in your query.


First, the odds of other people hurting you by stealing your content aren't very high. Look up SEOmoz.com on copyscape.com and you'll see that there are pages of results, but because they were the originators of the content, it's not likely that they'll be filtered out or receive any sort of penalty.


If you have content that other people have copied or stolen, you can try emailing the webmaster and kindly asking them to take it down. The chances of them responding aren't great, so the best thing you can do is probably just forget about it. People steal content left and right on the Internet; dwelling on it is just wasting your time when your site probably isn't getting penalized for it anyway.


Luckily, if you are getting penalized because you have duplicate pages, it's on your end of things and relatively easy to fix. If you have duplicate homepage problems, locate your .htaccess file.


Add the following code to redirect all your www URLs to the non-www URLs (a bare RedirectMatch with no host check would redirect every request, including the ones already on the right hostname, and loop forever, so a rewrite rule with a hostname condition is used instead):

RewriteEngine On
RewriteCond %{HTTP_HOST} ^www\.domain\.com$ [NC]
RewriteRule ^(.*)$ http://domain.com/$1 [R=301,L]

You'll need to replace "domain.com" with your own domain. If you want everything to go to www instead of non-www, flip it around: match the bare domain.com host in the condition and redirect to http://www.domain.com/$1.


If you need to get rid of your /index or /home page problems, you'll need to implement a simple 301 redirect. This is specified in the .htaccess file using the code below:

Redirect 301 /badurl.htm http://www.example.com/

Change the example URLs to make sense for your particular situation. For example:

Redirect 301 /index http://www.example.com/

For more clarification, this tells the server to permanently redirect your /index to http://www.example.com/, leaving you with a clean URL structure. Now all your duplicate homepages should go to either http://example.com or http://www.example.com, whichever you preferred.


For example, if you have a product site that has more than one way of getting to a product, those duplicate URLs could be hurting each other:


http://www.site.com/iPods/skins/blue-iPod-covers vs. http://www.site.com/skins/ipods/blue-ipod-covers


Same page, different URLs. In this instance, using a rel=canonical tag is in your best interest. Using it tells the major search engines that the page duplicating your other page should be treated as one and the same. For example:


If http://www.site.com/ipods/skins/blue-ipod-covers isn't the correct page and you would rather have http://www.site.com/skins/ipods/blue-ipod-covers be the main page, you'd want to put a rel=canonical tag on http://www.site.com/ipods/skins/blue-ipod-covers. This way the search engines understand that it's a duplicate page and that you want all the links and other metrics directed toward the right page. No longer will the search engines be confused about which page to display or give credit to.
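As a sketch using the URLs from this example, the tag sits in the head of the duplicate page and points at the preferred one:

```html
<!-- Placed on http://www.site.com/ipods/skins/blue-ipod-covers -->
<link rel="canonical" href="http://www.site.com/skins/ipods/blue-ipod-covers">
```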


Using the rel=canonical tag is an alternative to programming a 301 redirect. A 301 redirect is still the preferred way to guarantee the search engines understand your intent to move content from one URL to another.


In addition to fixing potential duplicate content issues, treating the two separate pages as one can help any keyword cannibalization that could be going on.




An over-optimizing nightmare: Staying off Google's naughty list

Disclaimer: This post describes a personal experience. The views and opinions in this post do not necessarily reflect those of SEO.com or its work with clients.


For the most part, link building is pretty straightforward and simple. You publish articles, request directory listings, build bookmark links, write guest blog posts, request links from other webmasters, or even purchase links if you're feeling especially rebellious. But keep in mind that if you don't have a strategy behind it, you could fall face-first into a ditch full of sadness and remorse.


Many of us start a website in the hope that in 5-6 months we may see some decent cash start rolling in. Because you need link building to achieve those rankings, you must ensure that the links you build match the stage your site is currently at. Let me explain.


Most experienced search engine optimization professionals understand that you need a healthy balance of links. Keeping link building in moderation, with a proper ratio of anchor text links to non-anchor-text links, is essential. If your whole backlink portfolio is anchor text links, it doesn't look natural to the search engines. The same can be said when everything is a directory link, a bookmark link, or, above all, a comment link.


If you want to submit articles, make sure you use your anchor text, but also make sure that some of those links are simply the URL of your website or your company name. If your website is brand new, the ratio of anchor text links to branded links should probably be around 50/50 so that your backlink profile doesn't look unnatural.


However, the same can't be said about large, established sites. Chances are that if your site has 40,000 backlinks, submitting a higher ratio of anchor text links isn't going to hurt you or your rankings. For example, if you pointed 1,000+ spammy, anchor-text-filled comment links at YouTube, do you think it would make a difference? On the other hand, if you did the same to a brand new site with no name or authority, it would probably receive a penalty very quickly.


I want to share a personal experience with this. On one of my personal websites I was not following my own advice. I got into the habit of using my anchor text in templated content. There was variation in the anchor text, but I never threw in my bare URL to make things appear more natural.


All I saw was a boost for a few months, and for two of my most important keywords I even reached first-page rankings. I was very happy and fully hoping that this site might actually bring in some money. Then, on a fateful day, Google dropped the hammer…



As expected, I was very disturbed, to say the least. After looking through my backlinks I found that I clearly had not been following a proven method. I had not been building enough natural-looking, credible links. Instead, I got caught up in my fantastic rankings and continued to submit content, directory listings, bookmarks, and other links using only my anchor text.


Because it was a new site with a limited online presence, this link building worked for nearly two months before it caught up with me. Had it been a website with some authority and a very healthy, natural-looking backlink portfolio, this probably would not have happened.


Keep in mind that when it comes to the links pointing back to your site, you should vary both your anchor text and the methods you use to build them. I think the same analogy applies (for the most part) to life: "Too much or too little of anything is a bad thing. Keep everything in moderation."


