Showing posts with label seo tips. Show all posts

Thursday, November 12, 2009

Exploiting Longevity For Google Search Results

Here’s an interesting observation I just made that justifies how I write for the web: Google rewards longevity, and doesn’t care so much about readability. A few months ago I did a revamp on There Is NO Box, and I am continually amazed at the traffic I get for years-old posts which are (frankly) not very good. Some of these are posts with one-word titles, or posts that are very short, or just generally lame.

Currently on Website In A Weekend, I’m producing a large number of articles, some of which are more technical (and take more time) than others.

But in all cases, when I feel the article is “good enough,” I pull the trigger and publish to get the permalink into Google’s index. I have read that this strategy is called “publish, then polish” (watch for an upcoming article explaining publish then polish in detail). It goes against every bit of formal training I’ve ever had as a writer, where “quality” is the only overriding concern, articles take a long time to produce, and inaccuracy is severely punished. Unfortunately, long, in-depth and technically detailed articles on blogs are punished by the “lack of readers” effect.

Using “publish then polish,” Google is now indexing me every day, sometimes within hours. My highest-traffic post so far on Website In A Weekend was published the morning WordPress 2.8 was released, and got a couple of dozen hits by the end of the day. It continues to get traction. But there are good posts on this site which haven’t yet gotten a couple of dozen hits total.

Right now a couple of dozen hits is pretty good, for any post.

Website In A Weekend doesn’t have many other readers yet. Google is probably the most active consumer… although Google sends very few search results this way.

At some point this will change. [Update 7/6/2009: in the three weeks since I wrote and scheduled this article, my daily traffic has roughly doubled!] When I do start getting traffic, I won’t feel compelled to write quite as much, and will concentrate on more refined output.

Until then, cranking out content in bulk seems to be the correct strategy. Long-time readers know I’m on a schedule to complete 101 articles on WordPress, tentatively scheduled for completion around August 10, 2009.

Whether the “publish then polish” strategy leads me astray in the long term remains to be seen.

What’s your experience?

Wednesday, November 11, 2009

Title Pages for Google and Yahoo

Tips for writing page titles
Search engines treat the page title as one of the important references for determining whether or not a page is relevant. The title is the essence of a page, so it should reflect the page’s content.

Some things to note are:

1. Number of characters
From sources I have read, a page title is optimal at up to 80 characters for Google, while for Yahoo the optimum ranges up to 115 characters.
The point is that exceeding the optimum within reasonable limits does no harm, but the keyword density of your title is of course taken into account, so it may be better to use a maximum of 80 characters.
Meanwhile, after running an experiment, I found that Google displays a maximum of only 65 characters in its SERP, and Yahoo about 115 characters, counting spaces and punctuation.
From this I also found that the part of a title after the 65th character does not help Google rankings much. In one case, there was a site listed no. 1 on Google for the keywords "I do not call". The site had a title of more than 65 characters, and I added the keywords "in Indonesia" to it (positioned after the 65th character). That site was then beaten even by sites with no "Indonesia" keyword anywhere in their pages.
As a second note, the site has a PageRank of 4. From this I conclude that title text after the 65th character has a smaller impact than inbound link text. (The second site, which does not have the keyword "Indonesia" on its pages, is probably listed because it has inbound text links containing "Indonesia".)
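The 65-character cutoff observed above can be sketched in a few lines of Python. This is a rough illustration only: the limit is this post's observation rather than an official constant, and modern engines actually truncate by pixel width.

```python
# Sketch: preview how a search engine might shorten a long page title.
# The 65-character limit is the observation from the post above, not an
# official constant.

def serp_title_preview(title, limit=65):
    """Return the title as it might appear, truncated, in a SERP listing."""
    if len(title) <= limit:
        return title
    # Cut at the limit, back up to the last full word, and add an ellipsis.
    return title[:limit].rsplit(" ", 1)[0] + "..."

print(serp_title_preview(
    "Ten Practical WordPress SEO Tips for Beginners Who Want More Search Traffic"
))
```

Running this on your own titles is an easy way to see what part of them survives the cutoff.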

2. Do not use words that search engines filter
For English-language sites, stop words like "is", "are", etc. should not be included in the title. For Indonesian-language sites this should not matter; as far as I know, the Indonesian equivalents of "the", "and", etc. are not yet filtered by the search engines. (IMHO)

3. Leave out words that are not useful
Words such as "welcome to our site" are better left out.

4. Reduce the use of separators
If possible, do not add separators just because you have a long title; otherwise the number of characters will end up exceeding the recommended maximum.

Title selection is also inseparable from how you choose good keywords. And do not forget that the title you choose should have a good keyword density.

Oes Tsetnoc Buzz Growing In SERP

I don’t really know why this is happening lately, but if you take the time to look at the Google SERP, the Oes Tsetnoc buzz is growing. It’s very different compared to last week. As I remember, there were only around 150k results then; now there are more than 200k. What’s going on? I can’t answer for sure, but in my opinion it’s because the time left in this SEO contest is getting short. I think there is only about a month of active optimization time left, so everyone, especially those who joined, will try anything to boost their SERP position.

Although the Oes Tsetnoc buzz is growing in the SERP, I am still lucky today: my main entry is still in the big four, and this morning I even saw it in third position. Since this situation makes optimization harder than ever, I now have to make routine updates to every dummy blog I have, to ensure the Google bot visits mine regularly. Hopefully it will keep treating mine as the best page until the final day comes.

Anyway, apart from the simple news above about the oes tsetnoc buzz growing in the SERP, I am now joining another local SEO contest too, entitled kerja keras adalah energi kita. Sadly, my entry is stuck in 6th place. So if any of your blogs have space to support it, please do so as soon as possible, guys. I really need your help today. Will you?

Friday, October 30, 2009

learn basics before anything fancy

I heard something on the radio a few weeks ago that stuck with me and made me think about the basics of SEO. The line went something like this, “You have to have the fundamentals down before trying any of the fancy stuff. ” So before you go out and try to do things like PR sculpting or any of the other advanced techniques you hear about, get the basics down.
SEO can really be broken down into three essential areas: Architecture, Content, and Links. These are the basics of SEO that you need to understand and get right first. Let’s take a look at some key points to understand in each of these main areas.
Architecture – Can Your Site Be Crawled?
One of the first problems that a website has to address is whether or not it can actually be crawled by the search engine spiders. You can have the best content in the world, but if the search engine spiders can’t get to it you won’t reap the benefits! Here are some things you can do to help your site be more crawl-able:
1. Avoid things like JavaScript or Flash navigation. Both of these kinds of navigation are not crawled very well by search engines at this time. This could change in the future, but for now it’s best to just avoid JavaScript and Flash navigations.
2. Keep your site’s architecture as flat as possible. Don’t have tons of levels in your architecture. Keep pages as close to the root as possible. In other words, mysite.com/folder/product is much better than mysite.com/category/subcategory/other-folder/product.
3. Stay away from parameter strings in URLs. By having parameter strings in URLs you could have multiple versions of the same content and will have to learn how to properly use the canonical element. You can avoid this by not using parameters. Instead, have a static URL for each page whenever possible. For example, mysite.com/productname.html is much better than mysite.com/?prod-id=abc123&cat-id=def456.
4. Use internal linking appropriately. Whenever it makes sense, link to other pages in your site from within the content of the page. Don’t just rely on your navigation to get people (and search engine spiders) to where you want them to go. (More about internal linking.)
5. Sitemaps are your friends. Make sure your site has both an HTML and XML sitemap. (More about sitemaps).
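To make point 3 above concrete, here is a small Python sketch that collapses URL variants differing only by tracking or session parameters into one static form. The parameter names below are illustrative examples, not an official list.

```python
# Sketch: collapse URL variants that differ only by tracking/session
# parameters into one canonical form. The parameter names here are
# illustrative, not an official list of what engines ignore.
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

TRACKING_PARAMS = {"sessionid", "trackingid", "utm_source", "utm_medium"}

def canonicalize(url):
    """Drop known tracking parameters and return the cleaned URL."""
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query)
            if k.lower() not in TRACKING_PARAMS]
    return urlunsplit(
        (parts.scheme, parts.netloc, parts.path, urlencode(kept), parts.fragment)
    )

print(canonicalize("http://mysite.com/products?sessionid=hgjkeor2&page=2"))
# -> http://mysite.com/products?page=2
```

A redirect rule built on this idea keeps each page reachable at exactly one URL, which is the goal of the architecture tips above.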
Content – Is It Optimized?
Once you have your website’s architecture set up the right way, the next step is to make sure that your content is well-optimized to help your site rank for your main keyword phrases. Here are a few basic guidelines to follow:
1. Don’t target too many phrases per page. You may have a list of 50 keywords you want to target, but you should only focus on 2-3 main phrases per page. Create other pages around additional phrases as needed.
2. No spammy stuff! Don’t do any keyword stuffing, alt stuffing, meta spamming, or any other spammy techniques. They don’t really work well anymore anyway.
3. Use your keyword phrases in titles, header tags, etc. By using your keyword phrases in your titles and header tags you can give them more emphasis.
4. Use your keywords in your content. Don’t just rely on your titles and header tags. Don’t overdo it; make the text read naturally but make sure you include your keywords and variations of them in the content.
Links – Getting Juice from Other Sites
Setting your site up the right way is one step, but getting traffic to your website takes a lot more than just using keywords on your pages. The other big key to getting a good rank on the search engines is to get other sites linking to you. By getting these links you are showing that your site has credibility and is worth ranking well. Here are a few quick tips to keep in mind when you’re building links:
1. Use a variety of techniques. There are a lot of things you can do to build links: directories, articles, social bookmarking, forums … the list goes on and on. Mix up what you’re doing and get a variety of link types coming into your site. (More about link building)
2. Spread your links over a lot of domains. It’s important to get a lot of links, but it’s also important to get a lot of links spread over many domains. If you follow tip #1 this shouldn’t be much of a problem for you.
3. Use keywords in your anchor text. One problem that I’ve seen over and over is that someone will build links to their site using either their name, their business name, or their URL. This is nice if that’s what you want to rank for, but if you want to rank for a keyword phrase you have to use that phrase as the anchor text of your link.
4. Use a variety of anchors. Don’t just use the same keyword phrase over and over again. Mix it up so that you aren’t spamming one phrase too much. This will help your link building look more natural.
5. The work is never done. Don’t think you can just submit to a bunch of directories and your work is over. SEO is an ongoing process.
While these tips don’t cover everything you need to know about the three main areas of search engine optimization, this is enough to get you started. Spend some time looking over your site to make sure that you are doing these basics. Then, if you want, you can try to get a little fancy.
Subscribe to our blog to receive ongoing SEO tips.

Sunday, October 18, 2009

adding arrows to your quiver

Avoiding duplicates in the search engine index has consistently been a key concern we’ve heard from webmasters and site owners. Over the last few years, we have made significant strides in finding duplicates in our crawler and index algorithmically and provided webmasters with better tools for controlling these. Today we are announcing our support for a new HTML tag, the <link rel="canonical"> tag, which helps reduce duplicates by documenting the preferred URL form to access each page.
When you use the canonical tag, you can indicate the canonical URL form for crawlers to use for each page of content, no matter how it was retrieved. This puts the preferred URL form with the content so that it is always available to the crawler, no matter which session id, link parameter, sort parameter, parameter order, or other source of variance is present in the URL form used to access the page.
To do this, specify a <link> tag in the <head> section of your page content:

<link rel="canonical" href="http://www.example.com/products" />

The above tag indicates to the crawler that the URL it is present on should be represented canonically as http://www.example.com/products. This would eliminate the following duplicates:
http://www.example.com/products?trackingid=feed
http://www.example.com/products?sessionid=hgjkeor2
http://www.example.com/products?printable=yes&trackingid=footer
A few technical details:
• The URL paths in the tag can be absolute or relative, though we recommend using absolute paths to avoid any chance of errors.
• The tag can only point to a canonical URL form within the same domain and not across domains. For example, the tag on http://test.example.com can point to a URL on http://www.example.com but not on http://yahoo.com or any other domain.
• The tag will be treated similarly to a 301 redirect, in terms of transferring link references and other effects to the canonical form of the page.
• We will use the tag information as provided, but we’ll also use algorithmic mechanisms to avoid situations where we think the tag was not used as intended. For example, if the canonical form is non-existent, returns an error or a 404, or if the content on the source and target was substantially distinct and unique, the canonical link may be considered erroneous and deferred.
• The tag is transitive. That is, if URL A marks B as canonical, and B marks C as canonical, we’ll treat C as canonical for both A and B, though we will break infinite chains and other issues.
For several years, we have had a clear policy on handling redirects that allows you to take control of how crawlers and browsers relate between pages on your site. Another useful tool for eliminating spurious dynamic URLs and avoiding content duplication is the Rewrite Dynamic URLs feature of Site Explorer. All you need to do is authenticate your site in Site Explorer, which can now be done instantly, and then create a URL Rewriting rule. The benefit of this approach is that Yahoo! does not need to crawl your duplicate pages to discover the canonical relationships. The tag provides you with another resource to use, and is also being supported by our other partners in the Sitemaps effort, Google and Microsoft.
We recommend that you structure your site with normalized URLs and minimum duplication, or use 301s if need be. If those don’t work for you, try Site Explorer and/or the tag. Our support for the tag will be implemented over the coming months. Let us know if you have any questions on our Site Explorer Suggestion Board.
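As a sketch of the consumer side, a crawler that honors the tag could pull the canonical URL out of a page with nothing but the Python standard library. This is a minimal illustration, not how any engine actually implements it.

```python
# Sketch: extract the canonical URL from a page's HTML, roughly as a
# crawler supporting <link rel="canonical"> might. Standard library only.
from html.parser import HTMLParser

class CanonicalFinder(HTMLParser):
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        # <link rel="canonical" href="..."> carries the preferred URL form.
        a = dict(attrs)
        if tag == "link" and a.get("rel") == "canonical":
            self.canonical = a.get("href")

page = """<html><head>
<link rel="canonical" href="http://www.example.com/products" />
</head><body>...</body></html>"""

finder = CanonicalFinder()
finder.feed(page)
print(finder.canonical)  # http://www.example.com/products
```

Every session-id or tracking-parameter variant of the page carries the same tag, so they all resolve to the one preferred URL.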

Monday, October 12, 2009

your site SEO hokey pokey

I think it’s safe to say that everyone is familiar with the Hokey Pokey. You know, that song where you put a body part in, take it out and then shake it all about. Then in the end, you put your whole self in to finish the song.
Surprisingly, this song relates really well to doing search engine optimization for your site. The problem is many people are so focused on the foot, head or arm of SEO that they never get around to putting their whole self in and getting real results!
So let’s break down the SEO Hokey Pokey so you can make sure you’re doing all of the essentials to get your site ranked for your main keyword phrases.
You Put Your Head In…
Let’s start our SEO Hokey Pokey with the head. This would include everything that goes in the head section of your website. Here are a few things to consider for this section:
Title Tag
As you’ve probably heard, this is one of the most important parts of your site for keyword usage.
You want to make sure that your titles are unique for every page and use the most important keywords. A good format to follow is this: Descriptive Keyword Based Title | Site Name or Brand.
Keep your titles to around 65 characters or else they will be truncated in the search engine results pages (SERPs).
Meta Tags
When it comes to meta tags, the most important one for search engines is the meta description. This is what most search engines will use as your description in the search results pages. Include your keyword phrases for that page and a call to action. Limit its length to around 165 characters.
For the meta keywords, either leave them blank or keep them very simple. This meta tag is not as important as it used to be, and if you put your best keywords there, your competition will easily know what your keywords are.
There are other things you’ll want to include in your header in certain situations, but the essentials really are the title and meta description. Get those right and you will have this part down. Put your head in!
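The title and meta description advice above can be sketched as a minimal head section. The store name, keywords, and wording here are made-up examples:

```html
<head>
  <!-- Unique, keyword-led title, kept near the 65-character cutoff -->
  <title>Handmade Leather Wallets | Example Store</title>
  <!-- Description with keywords and a call to action, under ~165 characters -->
  <meta name="description" content="Browse handmade leather wallets crafted from full-grain leather. Free shipping on every order - find your perfect wallet today.">
</head>
```

Each page of the site would get its own variant of this pair, never a shared one.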
You Put Your Foot In…
The foot is your foundation, and for this analogy we’re talking about the foundation of keyword-optimized content and crawlable site architecture.
If you want your site to rank well for certain keywords and phrases, you have to make sure you’re using them in your site. Otherwise, your site won’t look relevant to those phrases and will not rank well. Use your keywords in your text and link from other pages of your site with the right anchor text.
When it comes to architecture, you have to make sure that your site is set up in such a way that it is crawlable by the search engine spiders. If it isn’t crawlable, it does not matter how good your content is, no one will see it! Avoid Flash sites, JavaScript navigation bars and anything else that would impede the search engines from crawling from page-to-page.
Your foundation has to be strong, so put the time in to get it done right! Put your foot in!
You Put Your Backside In…
Your backside in our Hokey Pokey analogy is all the behind the scenes, or back-end things that go into your site. Here are a few to check on:
• Do you have a robots.txt file set up to disallow search engine spider access to parts of the site that you don’t want indexed, or that can’t be indexed?
• Do you have an HTML and XML sitemap to help the search engine spiders find all the pages of your site?
• Is the content management system (CMS) or e-commerce system you are using creating search engine friendly URLs?
A lot of the backend stuff will depend on the way your site is built, whether done page by page or with some kind of CMS or e-commerce system. Make sure you’re using one that can create a search engine friendly site. Then take the time to get your robots.txt and sitemaps created properly. Put your backside in!
You Put Your Arm In…
I’m sure you’re dying to know what the arms are in this analogy. Well, they’re the links you get from other sites!
You see, if you are going to get good rankings, you have to reach out and get links from other sites. I’m not going to go over all of the different strategies on how to do this, but this is going to be the bulk of your ongoing SEO work.
Here are a few articles from the SEO.com Blog that will teach you more about link building:
• Using Articles for Link Building
• 10 Link Building Strategies For New Website Or Business Owners
• The Value of Deep Linking
• Link Building – Finding the Right Site
Sadly, this is an area where many people either don’t put in enough time or just don’t get the right kind of links. I’ve seen a lot of sites that are very well optimized, but they are missing this critical piece. Learn how to link build and then take the time to do it! Put your arm in!
You Put Your Whole Self In…
SEO is the combination of a bunch of little things that give you a result in the end. You can’t just focus on one or two of the essential tasks and expect to get great results. Just like the Hokey Pokey, you have to get to the point where you are putting your whole self in to get results.
If you’re new to SEO, I hope this analogy helps you understand what the essentials are. If you’re an experienced SEO, now you have a dance you can do while you work! Now do the SEO Hokey Pokey!

Sunday, October 11, 2009

SEO Tips for Beginners

The rise in searches for "seo training for beginners" on Google inspired me to share these SEO tips for beginners. I have shared tips before, from the first "learn SEO" series to the second "learn easy SEO" series, and the core of those two posts has already been covered :D. But when I look at them again, those "seo tips" were not really for "beginners" :D. Before you read more of this "seo tips for beginners" post, it is a good idea to first read what SEO is on Wikipedia, because there is no point in reading this pile of my writing if you do not yet know what SEO means :P. Okay, fathers, mothers, and everyone else viewing this blog: once you understand SEO, please read on.

Well, the best SEO tips for bloggers who are just starting out in the world of blogging are as follows:

1. Give your article a title containing the keywords you want to optimize, or "shoot", for. For example, if I want to be No. 1 for "seminar blog in new york city", then I give the article exactly that title.

2. Write the article’s content according to your own abilities. Do not plagiarize, and do not copy and paste. It must be original, based on your own thinking :).

3. Update your articles regularly and routinely. For instance, if you usually update only once a week, make it a routine weekly update. But best of all is to post one new article every day :D.

4. Set apart each of the keywords in your articles, whether by bolding or coloring every keyword you want to optimize for in the search engines.

5. Look for backlinks, whether through free classifieds, social media, or by finding blogs and getting added to their blogrolls. This one is entirely up to you :D.

As one of the keyword-shooting bloggers in Indonesia (pardon me), it seems 2008 deserved to be called the year of keyword shooting. In 2007 the number of bloggers specializing in SEO could still be counted on fingers and toes 8-); by 2008, even borrowing mas Iwan’s fingers and toes I could not count all the SEO experts. Maybe the IMFreakz team’s victory in the "Busby SEO Challenge" contest was the first milestone showing that keyword shooters in Indonesia cannot be underestimated, and proof that Indonesian bloggers can do it. We pray that in the Busby SEO Test contest, the Indonesian team can win.

Okay, no more chit-chat and long-windedness :D. Good luck with these SEO tips for beginners from me :). These tips are (in my experience) much easier than fussing over meta tags or other methods. Feel free to try them, and may you succeed.

Wednesday, September 30, 2009

Killer keywords and domain

Not too long ago, I wrote a post about the importance of selecting a strong domain name and a few tips to help you come up with good domain names. I recently came across a great example of how much added value having keywords in your URL can bring to your SEO efforts.
I have a friend that registered the domain name coldplaymusic.net. The .com was already bought and was parked by some domainer, so he settled for the .net. If you do a search for “Coldplay music” you will find that coldplaymusic.net is in the second page of results. This is the shocking part… The domain name was purchased about a month ago. There have been no links built to this site, and there is not a lot of content. Even with these aspects working against its rankings, the site has still managed to make it on the second page for a search with a volume of over 200 thousand searches each month.
The content on the site has been optimized, so I can’t give all the credit to the domain. However, it does suggest the importance of having keywords in your domain. Because having keywords in a domain has helped web sites show up for particular keywords, some domains become quite valuable. Some of you might remember that the domain Pizza.com sold last year for over two million dollars. The previous owner purchased it fourteen years ago for twenty dollars. Now that is what I call a good ROI.
As with everything, you need to be moderate in how you place keywords in your domain. You are limited to 63 characters or less in your domain (this does not include the .com, .net, or other extension). Also, you should remember that if you have a domain like cheapviagragamblingpornreplicawatchesandmoneymakingschemes.com it will definitely look like spam to the search engines. Keep an eye out for my next post, which will provide other tips for making your domain work for you.
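The 63-character limit mentioned above matches the DNS rule for a single dot-separated label. Here is a quick Python sketch for checking a candidate name before registering it; it covers only the basic length and hyphen rules, not every DNS restriction.

```python
# Sketch: check a candidate domain label (the part before the extension)
# against basic DNS rules: 1-63 characters, letters/digits/hyphens only,
# and no leading or trailing hyphen. Not a full DNS validator.

def label_ok(label):
    """Return True if the label fits basic DNS label rules."""
    return (
        1 <= len(label) <= 63
        and not label.startswith("-")
        and not label.endswith("-")
        and all(c.isalnum() or c == "-" for c in label)
    )

print(label_ok("coldplaymusic"))  # True
```

Even when a keyword-stuffed name passes this check, the spam warning above still applies.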

Thursday, September 24, 2009

i put that seo button

For all you people out there who are looking to do SEO, I have some seriously bad news. You might want to sit down for this. Ready? There is no instant SEO button. I’m sorry to have to bear this bad news to you, and I hate having to be the one to break it to you. There’s no switch either. Or simple form to sign, trick to use, or connections you can have with people on the inside. If you want to be there in the top of the field with the best sites, you can’t just call up Google and say “Ok I’m ready.” It takes work and it takes time.
Even though this is fairly well known by now, it’s tempting to think that SEO is that simple. Regardless of how good your SEO firm is, you still have to be a relevant site. Even then, it will still take work and effort to get you to the top of the search engine rankings.
It will happen, from time to time, that a new site will get into a contract for SEO and stop their own developments, essentially filling that one basket with all the eggs. This actually makes things more difficult for the SEO firms. The fact of the matter is that we don’t suddenly make your site more relevant to people searching on your terms. We work to make it so that Google can see your site better so they can decide how relevant you are. We will make suggestions on how to make your site more relevant to your users. A hard, but necessary question to ask yourself is whether your site really is the best site to show up for the given keyword. If it’s not, perhaps you need to make some adjustments.
Here are a few tips to making your site the best site available for your keywords:
• Make sure you have some method of keeping your site up to date, and a source of information (where possible). A blog, or news section works well for this idea.
• Don’t fall too in love with the overall design and look of your site. Be willing to make changes, and reorganize and restructure how the site works.
• Consider keywords that don’t have corresponding pages. Perhaps pages need to be created to fit that missing piece.
• Most importantly, continue to develop your site like you would your business.
In the end, having a website that people want to find makes SEO that much better and faster. Working with your SEO service provider to make sure that you really are the best site out there will do wonders, not to mention make a lot of people happy, including your SEO firm.

Friday, September 11, 2009

How to choose the right keywords

Choosing good keywords means choosing keywords focused on what people search for, not on what your product, article, or content is about.
For example, people are more likely to search for "hotel room prices" than "Hotel Mutiara", so a company name is less useful in terms of what people search for. From the search engine’s point of view there is certainly no problem, because both keywords will be indexed by Google. So once again, focus on what is generally searched for.
To pick keywords and find out what people search for most, you are unlikely to land on the right keywords just by thinking hard over coffee. It is better, faster, and more accurate to use an existing tool; below is a list of keyword tools you can use:

inventory.overture.com
With Overture’s facility you can enter a keyword and you will be given the number of searches for that keyword in recent months. You will also be given a list of relevant alternative keywords and the number of searches for each of these alternatives.

Wordtracker.com
This tool has the advantage of a built-in thesaurus, so you will be given keywords commonly used by people searching for things related to the keywords you provide. With this you can choose more appropriate keywords than the ones you entered.

adwords.google.com
A tool from Google actually made for advertisers on Google AdWords. But the tool is very useful for finding the best keyword choices, and you can also peek at how competitive the keywords that advertisers use are. In addition, this Ajax-based tool is very easy to use for selecting keywords, and you can export the results directly to an Excel file.

A complete list of keyword tools can be found in the tools category.

Sunday, September 6, 2009

killer robots

If you haven’t heard of Mr. Robots, don’t blame yourself. It wasn’t even on the SEO map till just a couple years ago. Most of you, however, know what it is but don’t know exactly how to dominate the robots.
Robots.txt files are no secret. You can spy on literally anyone’s robots file by simply typing “www.domain.com/robots.txt.” The robots.txt should always and only be in the root of the domain and EVERY website should have one, even if it’s generic and I’ll tell you why.
There’s mixed communication about the robots. Use it. Don’t use it. Use meta-robots. You could have also heard advice to abandon the robots.txt all together. Who is right?
Here’s the secret sauce. Check it out.
First things first, understand that the robots.txt file was not designed for human usage. It was designed to command search ‘bots’ about how exactly they can behave on your site. It sets parameters that the bots have to obey and mandates what information they can and cannot access.
This is critical for your site’s SEO success. You don’t want the bots looking through your dirty closets, so to speak.
What is a Robots.txt File?
The robots.txt is nothing more than a simple text file that should always sit in the root directory of your site. Once you understand the proper formats it’s a piece of cake to create. This system is called the Robots Exclusion Standard.
Always be sure to create the file in a basic text editor like Notepad or TextEdit and NOT in an HTML editor like Dreamweaver or FrontPage. That’s critically important. The robots.txt is NOT an html file and is not even remotely close to any web language. It has its own format that is completely different than any other language out there. Lucky for us, it’s extremely simple once you know how to use it.
Robots.txt Breakdown
The robots file is simple. It consists of two main directives: User-agent and Disallow.
User Agent
Every item in the robots.txt file is specified by what is called a ‘user agent.’ The user agent line specifies the robot that the command refers to.
Example:
User-agent: googlebot
On the user agent line you can also use what is called a ‘wildcard character’ that specifies ALL robots at once.
Example:
User-agent: *
If you don’t know what the user agent names are, you can easily find these in your own site logs by checking for requests to the robots.txt file. The cool thing is that most major search engines have names for their spiders. Like pet names. I’m not kidding. Slurp.
Here some major bots:
Googlebot
Yahoo! Slurp
MSNbot
Teoma
Mediapartners-Google (Google AdSense Robot)
Xenu Link Sleuth
Disallow
The second most important part of your robots.txt file is the ‘disallow’ directive line which is usually written right below the user agent. Remember, just because the disallow directive is present does not mean that the specified bots are completely disallowed from your site, you can pick and choose what they can and can’t index or download.
The disallow directives can specify files and directories.
For example, if you want to instruct ALL spiders not to download your privacy policy, you would enter (note that the path starts with a slash and is relative to your site root):
User-agent: *
Disallow: /privacy.html
You can also specify entire directories with a directive like this:
User-agent: *
Disallow: /cgi-bin/
This will block all spiders from your cgi-bin directory. Again, if you only want a certain bot to be disallowed from a file or directory, put its name in place of the *.
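Assuming (as the standard does) that well-behaved crawlers honor these rules, you can sanity-check a draft robots.txt before uploading it with Python’s standard-library `urllib.robotparser`. The rules and domain below are placeholders mirroring the examples above:

```python
from urllib.robotparser import RobotFileParser

# Draft rules: block every bot from the privacy page and the cgi-bin.
rules = """\
User-agent: *
Disallow: /privacy.html
Disallow: /cgi-bin/
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

# can_fetch() answers: may this user agent download this URL?
print(rp.can_fetch("Googlebot", "http://www.yourdomain.com/privacy.html"))   # -> False
print(rp.can_fetch("Googlebot", "http://www.yourdomain.com/cgi-bin/a.cgi"))  # -> False
print(rp.can_fetch("Googlebot", "http://www.yourdomain.com/index.html"))     # -> True
```

If `can_fetch()` disagrees with what you intended, fix the file before the real bots read it.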
Super Ninja Robots.txt Trick
Security is a huge issue online. Naturally, some webmasters are nervous about listing the directories that they want to keep private thinking that they’ll be handing the hackers and black-hat-ness-doers a roadmap to their most secret stuff.
But we’re smarter than that aren’t we?
Here’s what you do: If the directory you want to exclude or block is “secret” all you need to do is abbreviate it and add an asterisk to the end. You’ll want to make sure that the abbreviation is unique. You can name the directory you want protected ‘/secretsizzlesauce/’ and you’ll just add this line to your robots.txt:
User-agent: *
Disallow: /sec*
Problem solved.
This directive will disallow spiders from indexing directories that begin with “sec.” You’ll want to double check your directory structure to make sure you won’t be disallowing any other directories that you wouldn’t want disallowed. For example, this directive would disallow the directory “secondary” if you had that directory on your server.
To make things easier, disallow values are prefix matches by default. If you disallow /tos, it blocks every path that begins with /tos: a file such as /tos.html, as well as any file inside the /tos directory, such as /tos/terms.html.
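You can see the prefix behavior directly with Python’s standard-library parser (which implements the original exclusion standard; the trailing-asterisk syntax shown earlier is a later search-engine extension it does not understand). The paths are made up for illustration:

```python
from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
rp.parse("User-agent: *\nDisallow: /tos".splitlines())

# Every path that *begins* with /tos is blocked -- including the
# over-broad match /toss-up.html, the same caution as "secondary" above.
for path in ("/tos.html", "/tos/terms.html", "/toss-up.html", "/index.html"):
    print(path, rp.can_fetch("Googlebot", "http://www.yourdomain.com" + path))
```

Everything but /index.html comes back blocked, which is exactly why you should double-check your directory names before using short prefixes.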
Important Tactics For Robot Domination
• Always place your robots in the root directory of your site so that it can be accessed like this: www.yourdomain.com/robots.txt
• If you leave the disallow line blank, it indicates that ALL files may be retrieved.
• User-agent: *
Disallow:
• You can add as many disallow directives to a single user agent as you need, but every user agent must have at least one disallow line, whether or not it actually disallows anything.
• To be SEO kosher, at least one disallow line must be present for every user agent directive. You don’t want the bots to misread your stuff, so be sure to get it right. If you don’t get the format right, they may just ignore the entire file, and that is not cool. Most people who have content indexed when they don’t want it to be have syntax errors in their robots.txt.
• Use the Analyze Robots.txt tool in your Google Webmaster Account to make sure you set up your robots file correctly.
• An empty robots.txt file is exactly the same as not having one at all. So, if nothing else, use at least the basic directive to allow the entire site.
• How do you add comments to a robots.txt? Just throw a # in front, and that entire line will be ignored. DO NOT put comments at the end of a directive line; that is bad form, and some bots may not read it correctly.
• What stuff do you want to disallow in your robots?
o Any folder that you don’t want the public eye to find or those that aren’t password protected that should be.
o Printer friendly versions of pages (mostly to avoid the duplicate content filter).
o Your image directory, to protect your images from leeches and to make your text content more spiderable.
o CGI-BIN which houses some of the programming code on your site.
o Bots you find in your site logs that are sucking up bandwidth and not returning any value.
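Putting the checklist together, here is what such a robots.txt might look like, verified with Python’s `urllib.robotparser`. The directory names and the `BadBot` agent are hypothetical placeholders; substitute your real paths and the bandwidth hogs from your own logs:

```python
from urllib.robotparser import RobotFileParser

robots_txt = """\
# Hypothetical example -- swap in your own directories and bot names.
User-agent: BadBot
Disallow: /

User-agent: *
Disallow: /private/
Disallow: /print/
Disallow: /images/
Disallow: /cgi-bin/
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

print(rp.can_fetch("BadBot", "http://www.yourdomain.com/index.html"))        # -> False
print(rp.can_fetch("Googlebot", "http://www.yourdomain.com/index.html"))     # -> True
print(rp.can_fetch("Googlebot", "http://www.yourdomain.com/print/p1.html"))  # -> False
```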
Killer Robot Tactics
• This setup allows the bots to visit everything on your site, and sometimes on your server, so use it carefully. The * specifies ALL robots, and the open disallow directive applies no restrictions to ANY bot.
User-agent: *
Disallow:
• This setup prevents your entire site from being indexed or downloaded. In theory, this will keep ALL bots out.
User-agent: *
Disallow: /
• This setup keeps out just one bot. In this case, we’re denying the heck out of Ask’s bot, Teoma.
User-agent: Teoma
Disallow: /
• This setup keeps ALL bots out of your cgi-bin and your image directory:
User-agent: *
Disallow: /cgi-bin/
Disallow: /images/
• If you want to disallow Google from indexing your images in their image search engine but allow all other bots, do this:
User-agent: Googlebot-Image
Disallow: /images/
• If you create a page that is perfect for Yahoo!, but you don’t want Google to see it:
User-Agent: Googlebot
Disallow: /yahoo-page.html
# Don’t use user agents or robots.txt for cloaking. That’s SEO suicide.
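The recipes above can be checked the same way with Python’s standard-library parser. In this sketch (all names and URLs are placeholders), Teoma is shut out entirely, Googlebot-Image is kept away from images only, and everyone else falls through to the open default:

```python
from urllib.robotparser import RobotFileParser

rules = """\
User-agent: Teoma
Disallow: /

User-agent: Googlebot-Image
Disallow: /images/

User-agent: *
Disallow:
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

print(rp.can_fetch("Teoma", "http://www.yourdomain.com/page.html"))              # -> False
print(rp.can_fetch("Googlebot-Image", "http://www.yourdomain.com/images/a.jpg")) # -> False
print(rp.can_fetch("Googlebot-Image", "http://www.yourdomain.com/page.html"))    # -> True
print(rp.can_fetch("Googlebot", "http://www.yourdomain.com/images/a.jpg"))       # -> True
```

Note how the specific user agent entries win over the * entry only for the bots they name.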

If You Don’t Use a Robots.txt File…
A well-written robots.txt file helps many sites get indexed up to 15% deeper. It also lets you control your content so that your site’s SEO footprint is clean, indexable, and literal fodder for search engines. That is worth the effort.
Everyone should have and employ a solid robots.txt file. It is critical to the long term success of your site.
Get it done.
Bling.

Tuesday, August 18, 2009

misspelled keywords good luck!

Ranking well for a popular keyword phrase is getting harder and harder these days. But did you know that about 10 million times a day someone misspells a keyword in their search? Normally to those poor saps I would say, “Hey, good luck with that!” But the truth is, more and more online companies are taking advantage of the misspelled keywords.
For example, if your website is selling office calendars, the average monthly search for “office calendars” is about 40,500. The term “office calenders” is searched about 1,300 times per month. What does this mean for you? Well, even though the misspelled term searches are significantly less (only 3% of the correctly spelled term’s search volume) you will not need to fight as hard to get to the top of this search, thus bringing in some extra traffic from those bad spellors that u may knot hav otherwize bin kounting on.
So what is the best way to optimize for these misspelled keywords? The first trick is to know what your options are for the possible misspellings. A great tool to use for this is the Seobook.com Typo Generator—although, in the 146 possibilities this tool gives for the word “Calendar,” they don’t even mention “Calender.” So, you will want to check out more than one source. Another source is checking the “100 Most Often Misspelled Words in English” from yourdictionary.com or other dictionary sites. Make sure to check the misspelled word’s estimated search traffic to see if it is worth optimizing for.
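If you want a quick second opinion alongside those tools, a few lines of Python can enumerate common typo patterns (dropped letters, doubled letters, adjacent swaps, vowel substitutions). This is an illustrative sketch, not the algorithm any particular tool uses; always cross-check candidates against real search volume before optimizing:

```python
def typo_variants(word):
    """Generate naive misspelling candidates for a keyword."""
    word = word.lower()
    vowels = "aeiou"
    variants = set()
    for i in range(len(word)):
        variants.add(word[:i] + word[i + 1:])                # dropped letter
        variants.add(word[:i] + word[i] * 2 + word[i + 1:])  # doubled letter
        if i + 1 < len(word):                                # adjacent swap
            variants.add(word[:i] + word[i + 1] + word[i] + word[i + 2:])
        if word[i] in vowels:                                # vowel substitution
            for v in vowels:
                variants.add(word[:i] + v + word[i + 1:])
    variants.discard(word)  # the correct spelling isn't a typo
    return sorted(variants)

candidates = typo_variants("calendar")
print("calender" in candidates)  # -> True
print(len(candidates))
```

Unlike the 146-item list mentioned above, this naive generator does catch “calender,” which shows why checking more than one source pays off.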
Next you’ll need to optimize for the misspelled term on your page. There are several ways of accomplishing this, but I believe one of the classiest methods is simply to post an article or blog post on your site that mentions different common misspellings of the desired term. Don’t base your entire SEO strategy on optimizing for misspellings, though: Google oftentimes helps out poor-spelling patrons by showing the “Did you mean…” option above the search results.
Well, there you have it: a new arrow in your quiver. Just make sure to take some time to do a little research (the Google AdWords keyword tool) before spending energy on the optimization. Oftentimes it’s hit or miss with these misspellings, so, good luck with that!

Wednesday, August 12, 2009

4 Tips to exchange links

When exchanging links, there are some things that must be considered. Here are 4 tips for exchanging links, sibaho-style:

1. Apple to apple, or orange to orange. That is, the placement of the links should match: index page (home) for index page, not your index page against the other blog’s post page. That kind of trade isn’t fair, because the two placements provide very different backlink quality. Several times I have ignored friends’ invitations to exchange links because, after surveying their blogs, I found my link had been collected into a blogroll on a special post. I realize, though, that this was surely not intentional on my friends’ part...
2. Give and take. Someone has to place the link first. If you extend an invitation and the other party ultimately doesn’t reciprocate, delete the link. If an invitation gets no response, confirm again; once a reasonable time limit has passed, delete it. There are plenty of other blogs to invite to exchange links :)
3. Look forward, not at now. Exchanging links is like investing. Don’t dwell only on blogs that already have high PageRank; that takes great effort and carries a high potential for failure. Why not ask to exchange links with newbie blogs that have potential for the future? Besides being easier, it will also train our intuition for seeing the potential of a blog :))
4. Friendship above all. If someone declines your invitation to exchange links, don’t break off the relationship and immediately leave their blog. Ninety percent of the links I put in my blogroll are there not because of link exchanges but because of friendships formed through blogwalking activity. So it’s quite confusing when someone comments on our blog once and then, the next thing I know, immediately invites a link exchange.

Saturday, August 1, 2009

your-content-link-thirsty

Let’s face it: link building is drab work. It’s the kind of work most of us like to hand off to the “next guy.” Nevertheless, I firmly advocate its role in Search Engine Optimization. Google’s key indicator of the “good,” “better,” and “best” websites depends on the number of backlinks a site has and where those links come from.
I assume you, the reader, have already figured this out. An SEO firm that doesn’t offer link building strategies to its clients might as well pack up. A site with no links is like a telephone pole with no wires: useless. Consequently, instead of asking “if” link building should be done, online marketers are always asking “how” it will be done. What’s the best strategy?
A relative of mine created an informative website, after doing his research, about “Good Security Questions.” That was over two years ago. The site lacked all the bells and whistles of newer Web 2.0 sites, but it was content-rich, and more importantly, there was a niche market that needed the information the site provided.
He didn’t know much about Search Engine Optimization. He didn’t think much about keyword research or quality link building. I spoke to him recently about the results he’s had lately. His site now ranks #1 for “security questions.” He also told me that some larger corporate websites, such as American Express, Delta, Prudential, and ING Direct, recently changed their security questions to match the list of questions he had written on his website. People are also beginning to link to his site and use his content, with little to no SEO effort on his part.
His results reveal an old rule that’s been around ever since the first Neanderthal showed his prehistoric friends how to make fire: if you want people to listen to you, say something they want to hear. Thus, before you spend time and money developing an extensive link building strategy, first make sure your content is “link-thirsty”: make it useful, interesting, timely, or outrageous (see some of Adam’s tips about Buzz Marketing)!
Even better, include “quality content” as the first priority in your link building plan. Making your content appropriate (appropriate: suitable or fitting for a particular purpose, person, occasion) is, indirectly, by far the most successful link building strategy on the planet. Done right, link building will work for you, with no more mindless directory submissions. As a bonus, you’ll get traffic from relevant sources, which means higher conversion rates. Here are a few tips for making your content appropriate (this stuff isn’t breaking news, just common-sense principles that many companies fail to apply):
• Research and know your topic. This shouldn’t be difficult because you are already a guru of your industry.
• Be informative and as detailed as possible. (Word of caution: Quantity is good as long as you know how to present it in a comfortable, readable manner. Use lists, headings, and varied amounts of italicizing and bolding. Remember, people like to learn, but they don’t like to read)
• Link out to your sources. Google looks for links on your site as extra avenues to gather further information on your topic. Don’t be a dead end on the web.
• Make lists and tables. People like to gather information fast. Tables make it easy to compare items. You could even provide an objective microsite comparing your product with your competitors’ products (this only works if you, honestly, have the best product/service on the market).
• Tell stories and be narrative. This adds flavor to your content. Write with flamboyant, playful, or exciting language. Don’t always be so serious. People want to know there is a human being behind your content and your business.
• Be original. Are you telling something people already know? If it’s already known, take a new twist or add your own opinion. (Don’t copy content from other sites! First, this is plagiarism and second, Google doesn’t like duplicate content).
• Involve users. Yes, you will need to jump on the Web 2.0 bandwagon. Provide means to comment, write reviews, vote on items, etc.
• Check your spelling and grammar. Nothing is better at killing your credibility than poor writing. Hint: Get a professional editor to check your work.

Thursday, July 30, 2009

optimizing-for-navigational-searches

People search for many different reasons. These myriad reasons can be broken down into a few broad segments: informational, transactional, and navigational searches. There are millions of searches where people simply need information, but often even informational searches are just the early stages of the purchasing process. It is important to understand the different ways and reasons people search and to be sure that your site is optimized to be found at each stage.
The type of search that SEO people rarely talk about is the navigational search. Navigational searches are search queries where people search for your exact domain name or brand name. Basically, they know exactly what they are looking for, but they choose to search for it instead of typing it into the browser address bar.
Don’t think navigational search is a big deal? Check out the top 10 overall searches for July 2008 from Hitwise: myspace, craigslist, ebay, myspace.com, youtube, mapquest, yahoo, facebook, www.myspace.com, craigs list
Compete data shows that 17% of searches are navigational. Why are there so many searches for domain names and site names? Many people will enter the URL or site name into the search box; it magically appears as the top link, and they click on it. Sure, it adds an extra click that really isn’t necessary, but that’s how they learned to find things on the web, so why change if it works?
As marketers, we need to understand that this is a huge part of the way people navigate the web and we need to be sure our sites are optimized to show up when people are looking for us.
Domain Name Searches
First off, make sure your site is indexed in the search engines. Does your site show up first in a search for your domain name? Unless your site is penalized, it should show up at the top of the results when you search for the domain name. Follow SEO best practices with regards to your website architecture and content. Don’t hide your content behind javascript or Flash navigation. Include a link to your HTML sitemap in the footer of your site. Make sure your XML sitemap is updated and submitted to the search engines. Be sure you are blocking any directories and pages of your site that you don’t want indexed, but be careful to not deny the search bots from indexing the pages you want them to find. Adding a / in the wrong place in your robots.txt file can get your site removed from the search engines.
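That last warning is easy to demonstrate. Using Python’s standard-library robots.txt parser, compare a properly scoped rule to one with a stray site-wide slash (the domain and directory names are placeholders):

```python
from urllib.robotparser import RobotFileParser

# Intended: fence off a single directory.
scoped = RobotFileParser()
scoped.parse("User-agent: *\nDisallow: /private/".splitlines())

# Typo: a lone "/" tells every compliant bot to skip the whole site.
blanket = RobotFileParser()
blanket.parse("User-agent: *\nDisallow: /".splitlines())

url = "http://www.yourdomain.com/products.html"
print(scoped.can_fetch("Googlebot", url))   # -> True: only /private/ is off-limits
print(blanket.can_fetch("Googlebot", url))  # -> False: the entire site is blocked
```

One character is the difference between hiding a directory and de-indexing your whole site, so test the file before you deploy it.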
Company/Brand Name Searches
Brand name searches can be an interesting challenge. It seems like it would be so obvious to the search engines and your site would automatically show up at the top for all searches for your brand name. Sometimes that is the case, but the larger your brand is, the more competition you will have for your own name. This could come in the form of affiliates, review sites, news articles, press releases, and many other types of pages. Usually it is fine to have that stuff showing up, as long as your official site shows up at the top. To make sure it does show up, be sure to feature your brand name(s) prominently on the site in textual format, not just graphics. Most of the time you will have plenty of links to your site using your brand name as the anchor text. If your site isn’t on top, however, or if you have a newer site or brand, it might take a while to get enough link juice to move to the top for your own name.
Another issue that I’ve seen on occasion is when you run into competition from other companies or products that have the same or similar brand name. They might be in a completely different market, but if they have the same name, you could have a hard time beating them for your own name if they are more established and have more link equity than your site. Another thing to consider with name searches is misspellings or spelling variations. You usually won’t want to look stupid by misspelling your own company name in the content on your site, but a few low-profile links with the misspelled version will often do the trick. Sometimes Google figures out what they meant to type and serves up your site anyway, so you should check to see what shows up for common misspellings.
International Search Engines
If you have content catered to an international audience, you should make sure your site is showing up for brand/domain searches in each country. Google is the biggest search engine in most countries, the exceptions including Baidu in China, Yahoo! in Japan, and Yandex in Russia. Make sure your site is listed and shows up for your name in the top 3 search engines for the countries you are targeting. The easiest way to get top billing in the country-specific search engines, including “pages from” country searches in the localized Google search, is to have a separate, localized site on the country’s preferred TLD. You can also set up international sites on subdomains or subdirectories of your main domain. Then you can specify to Google which country each subdomain or directory pertains to.
Product Name Searches
Although not exactly navigational searches per se, I wanted to mention product name searches. People might search for a product name or SKU when they know exactly what they want to buy. They might be comparing prices or just looking for the best place to purchase. Or they may be looking for reviews and feedback from other purchasers of the same item. Or it could be existing customers looking for support information or accessories for their product. Whatever the reason, you would be well served to have your product pages showing up for these keywords. The best approach will depend on how competitive your products are. If you are the only one selling the product, all it will take is to get the pages indexed. If you are competing against thousands of other resellers, it will prove more difficult and you’ll need to do some serious optimizing and link building to those specific pages. Start off with the basics of getting the pages indexed. Spend some time searching for your products and see what you find. Analyze the level of competition and put together a plan to get your pages to show up on the first page. You might not get a ton of searches on any single page, but if you sell thousands of products, the aggregate traffic and sales from those keywords will make a big impact.
PPC for Navigational Searches
You should be able to show up for all of your brand terms without having to pay for the traffic. One thing to consider, however, is that if you augment your organic SEO with paid listings, you increase the chances of getting people to click through to your site. This can be especially important if you have competitors bidding on your brand or other navigational searches. You want to do everything you can to make sure searchers get to your site and not your competition. On the flip side, you could use the same strategy to try to capture some of the navigational search traffic to your competitors by bidding on their brand names and offering a compelling alternative product or service.
More than Just Rankings
One last thing I wanted to mention about optimizing for navigational searches: showing up for your domain and brand names is just the first step. You really want as many of those searchers as possible to click through to your site, so you should pay close attention to the title and description snippet that shows up in the search listings. If you have a number one ranking but no title or description, it will be easy for searchers to skip it for the more compelling link right below.

Wednesday, July 1, 2009

the competitive analysis

When I was in middle school, comic books were AWESOME. Looking back, I notice now that in nearly every comic book, eventually there was the villain that could take everybody’s powers and use them as his or her own. The uncanny X-Men battled the aptly named Mimic; the Avengers and the Fantastic Four both took a shot at the extra-terrestrial mega-warrior called the Super Skrull; and more recently the TV show Heroes has a creepy fellow who can do much the same thing. The list goes on. Feeling geeked out on super-powered goodness? This is still an SEO blog. And you should know that the common ability of these villains applies to SEO.
Think of your strongest competitors on the internet. Why are they competitors? Is it because their site has a lot of useful information? Is the site multilingual? Do they have links from big news sites like CNN, or from government or university sites? Is their blog well thought out and a leading, trusted voice in the industry? Are they taking advantage of new media and social media avenues? What do they have that gives them an edge? What do they do that makes them weak?
These are some of the major questions to ask when performing a competitive analysis. They tell you what makes your competition special and give you the information necessary for competitive search engine optimization. These answers are then followed by the question that touches the heart of competitive analysis: How can I do what they are doing, but better?
Now, I know what you are thinking right now: Sure, there’s money to be made as a super villain, but what about self-respect? Integrity?
The good news is that you can use your new-found powers garnered from competitive analysis for good. Competitive analysis isn’t about copying. Improve upon your competitors’ ideas, avoid their weaknesses, and build their strengths onto your own site’s ideas, structure and philosophy. This is the best way to stand out as most relevant to your potential visitors. The problem with the aforementioned villains (insert black-hat SEO) is that they’re frequently using their powers to deceive and control people in ways that shouldn’t be done. This sort of villainy will always lose out to the quality and relevance of a site properly strengthened with competitive analysis.
Here at SEO.com, competitive analysis is one of the first things we do with a new client. We look for strengths and weaknesses and give you suggestions on how to use what we find in ways that will increase your level of search engine optimization. Then we periodically keep an eye on the competition to see how (and if) your competitors are innovating in their approach to generating traffic, appealing to search engines, or making conversions. With greater power and strength than your competitors, how can you not succeed?

Monday, June 1, 2009

seo-questions-and-site-reviews

Got a question about SEO? Wish someone would tell you if your site is properly optimized for the search engines? We’re launching two new features here on the SEO.com blog: SEO Question/Answer and SEO site reviews.
The way the Question/Answer part works is that you ask a question about any topic related to online marketing: SEO, SEM, social media, etc. We probably won’t be able to answer every single question, but we’ll do our best to get to most of them, and we’ll at least respond even if we’re not able to post your question to the blog.
The second new feature is the SEO site reviews. You can submit a site to be reviewed by our team of SEO experts. We will analyze the site and make recommendations for what we would do if we were engaged to help optimize the site.
Any questions or site review requests can be submitted through our contact form.
I’m excited about these new features, and I hope you will take advantage of this offer for a little free SEO advice.

Sunday, May 24, 2009

seo-is-about-communication

We’ve all heard experts toot the horn of various elements of SEO: “Content is king!” or “It’s all about the links.” Others tout their proprietary software tools or throw out terms like latent semantic indexing. The truth is, all of these things play a part in SEO (along with a bunch of other things), but if you focus on any one of them, you’re missing the bigger picture.
SEO is about communication!

Communication is defined as: a process by which information is exchanged between individuals through a common system of symbols, signs, or behavior.
The communication process involves a sender and a receiver. As long as the sender and receiver both understand the common system of symbols, signs, or behavior, the sender’s message will be understood. Sounds simple enough, right? The challenge with any communication is that the message has to be properly encoded, passed along through the communication medium or channel, and then decoded and interpreted by the receiver of the message. With search engine optimization, we are dealing with an intermediary (the search engines) who really isn’t our intended receiver at all. We have to make sure the search engines understand the message we are sending, or the communication process breaks down and the message never makes it to our intended receiver. This makes for a much more complicated communication process.
Google has made it perfectly clear that they are making every possible effort to improve the quality of their search engine. This means they are doing their best to decode the meaning of web pages and other web content, at the same time as they are decoding the meaning of users’ search queries, all in an attempt to match up searchers with what they are looking for. They are trying to improve the web search communication channel. Our job as communicators is to create our websites and write our content in ways that make clear what we are trying to say. If we send the right signals to Google through the structure and content on our site, internal and external links, and all the other “SEO” tactics, Google will get the message and move our site to the top of the search heap.
Don’t forget that getting Google to properly decode and interpret our message is only part of the communication process. Google is not the final receiver of our message, remember? Ultimately you want your customers to find your site, click through and complete the desired action (lead, sale, call, whatever).
Just getting your site to the top of the search engine for your desired keywords does not mean your intended customers will get your message. If your title and description are just a bunch of mumbo jumbo, you won’t get many people clicking through. Take care to use titles and descriptions that convey the right message about what you have to offer. Once they click through to your site, the real communication takes place. If visitors don’t immediately see what they were looking for, they will back out and look elsewhere. If your message is not persuasive enough, the customer will move along to find something more convincing. Provide enough content and enough value to satisfy the searcher’s desire to find what they are looking for. Speak the same language as your site visitors and they will be much more likely to decode your message the way it was intended.
An important part of the communication process is feedback. Try to get feedback from your website visitors. Obviously, when you get a sale on your website, you know your message was successfully received (at least by that person). But what if you’re not getting any sales? Or what about the other 95% of the people who didn’t buy? Provide other feedback mechanisms to allow your customers to give you feedback about your message. Provide clear calls to action and give them multiple ways to contact you. Use your website analytics reports along with tools like ClickTale and CrazyEgg to find out how your site visitors behave. Spend the time to close the feedback loop so you can refine and improve your communication process.
Don’t ever forget that SEO is about communication and the search engine is merely the medium to reach your desired recipient.

Tuesday, May 19, 2009

transformer-of-seo

I have yet to see the sequel to Transformers, but I grew up playing with Transformers as a little boy, and I loved the first movie. Seeing the sequel is one of the top things on my list of things to do this summer. I have watched many trailers for it, and it looks like it’s going to be just as good as, if not better than, the first one.
When it comes to SEO, there are a few things that can be done to transform a campaign. These are what I call the transformers of SEO. Instead of transforming from a car into a robot, these elements, when properly optimized, will transform a good SEO campaign into a great one. When not used properly, or not used at all, getting top rankings may seem to be next to impossible.
Title Tag (Optimus Prime):
The title tag is like Optimus Prime: it can be the anchor of a well-optimized website. The title tag appears at the top of the browser when browsing the web or visiting a site. Of all the HTML elements, the title tag carries the most weight in search engine algorithms. A properly optimized title should contain keywords that are relevant to the page and should be no more than 70 characters long. It should be a well-written sentence formed from the targeted keywords, not just a list of keywords.
Meta Description Tag (Ironhide):
The Meta description is like Ironhide: it can give the website a tactical advantage when used properly. When optimized, this tag can increase the click-through rate of a site’s listing in the search results. The Meta description tag is not displayed on the website page itself. It is placed in the head section of the HTML code, usually right below the title tag. Meta descriptions have no influence on search engine rankings. However, if they contain the same terms as the search query, they are normally displayed as the description in the search results.
A good Meta description should contain the targeted search terms and describe the website page. It should be about 160 characters long and contain a call-to-action that will entice a searcher to visit the site. Meta descriptions should not be a long string of keywords separated with commas. This is referred to as the Meta keywords tag, not the Meta description.
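Those two rules of thumb (roughly 70 characters for the title, 160 for the Meta description) are easy to enforce with a small script. The limits below are this article’s guidelines, not official search engine specifications, and the sample page text is made up:

```python
# Character budgets from the guidelines above (rules of thumb, not specs).
TITLE_LIMIT = 70
DESCRIPTION_LIMIT = 160

def check_lengths(title, description):
    """Return a list of human-readable problems, empty if both fit."""
    problems = []
    if len(title) > TITLE_LIMIT:
        problems.append("title is %d chars (limit %d)" % (len(title), TITLE_LIMIT))
    if len(description) > DESCRIPTION_LIMIT:
        problems.append("description is %d chars (limit %d)"
                        % (len(description), DESCRIPTION_LIMIT))
    return problems

print(check_lengths(
    "Office Calendars | Acme Supplies",
    "Shop durable office calendars with free shipping. Order today."))
# -> []
```

Run it over your page titles and descriptions before publishing; anything it flags is a candidate for trimming.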
Internal Links (Bumblebee):
Bumblebee had an important role to play as an Autobot. Like Bumblebee, internal links have a critical role in an SEO campaign and can impact site rankings when optimized. To optimize a website’s link structure, the site should have navigation built with HTML text links. The links can be formatted to appear as buttons using cascading style sheets. They should use keyword-targeted anchor text that matches the targeted keywords of the page the link points to. Links created with JavaScript, Flash, or image rollovers, without optimized anchor text or the ability to be crawled by spiders, should be avoided.
A good SEO campaign should not neglect the optimization of any of these site elements. Including them can often make the difference between a second-page and a first-page ranking.

Thursday, May 14, 2009

Transformers of SEO: The Decepticons

Before anyone asks the question, the answer is: “Yes, I finally got out to the theater to see Revenge of the Fallen.” About 10 years ago life was much simpler. I could just pick a night, meet some of my friends and head out to the theater to see a movie. Now I am married and have a couple of kids. Getting out to the theater has become a rare event, but I did manage to make time over the weekend to see it with my brother-in-law.
Thanks to a couple of comments from Princess Zelda and Dan Schulz on my previous post about Transformers of SEO, I have had some inspiration to write a sequel. My previous post compared a few optimization strategies to the Autobots. This post will contrast some of the current blackhat SEO strategies to the Decepticons. I think this comparison is appropriate because, in the end, blackhat SEO only “deceives” the person using it by making them think that it will bring top rankings and instant wealth.
Cloaking (Megatron):
Megatron is the leader of the Decepticons and one of the most powerful Transformers. Just like Megatron, cloaking can be a powerful blackhat SEO tactic that could result in high search engine rankings. The downside is that once a site is caught cloaking, it is removed from the search index and all its top rankings are instantly taken away. Cloaking works by showing site visitors something completely different from what search engine spiders see. There are many ways a web developer can accomplish this, but whatever method is used, the end result is always the same.
Doorway Pages (Devastator):
Several Decepticons could combine to form one large robot called Devastator. Combined, they were a powerful and formidable opponent. Doorway pages can be compared to these Decepticons: one doorway page on its own can be an effective tool for driving traffic to a website, but multiple doorway pages can have a “devastating” effect. Doorway pages are single web pages set up and optimized for one or two search terms with the sole goal of sending people on to a main website. Creating several doorway pages targeting multiple search terms can eventually help a webmaster rank well for all their targeted terms, but doorway pages are just another way of tricking or deceiving the search engines. Search engines want to show searchers the most relevant websites for their query, not a page that merely forwards visitors to another site. Once doorway pages are discovered, they are removed from the search index and all the work of creating them is for nothing.
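For illustration only, a typical doorway page looks something like this: a thin, keyword-targeted page that immediately forwards visitors to the “real” site (the domain and keywords are placeholders, and this is shown as a tactic to avoid, not a recommendation):

```html
<head>
  <title>Cheap Transformers Toys</title>
  <!-- Instant redirect: the searcher never actually sees this page -->
  <meta http-equiv="refresh" content="0; url=http://example.com/store/">
</head>
<body>
  <h1>Cheap Transformers Toys</h1>
  <p>Click here if you are not redirected...</p>
</body>
```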
Keyword Stuffing (Starscream):
Starscream is Megatron’s sidekick but always seems more like a thorn in Megatron’s side. He was useful when he assisted the Decepticons in battle but was usually more of a hindrance than a help, constantly plotting to overthrow Megatron and take over as leader. I think keyword stuffing is much like Starscream. Many webmasters stuff page elements such as title tags, alt attributes and headers full of keywords, either to target many different search terms or to inflate keyword density. While this strategy may have worked in the past, today it only reduces the weight of the elements that have been stuffed. So instead of possibly ranking well for a few targeted search terms, the site will struggle to rank for any search terms at all. Keyword stuffing is a strategy of the past that is dead and gone, and anyone practicing it today can expect their SEO campaign to be dead and gone with it.
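A quick sketch of the difference, using a made-up image file: a stuffed alt attribute versus a descriptive one:

```html
<!-- Stuffed: hurts more than it helps -->
<img src="prime.jpg" alt="transformers toys cheap transformers toys buy transformers toys sale">

<!-- Descriptive: what the alt attribute is actually for -->
<img src="prime.jpg" alt="Vintage Optimus Prime action figure">
```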
Don’t be fooled by any of these blackhat strategies: they deceive webmasters into believing they can bring quick rankings and wealth, but in the end it is ethical SEO techniques that bring lasting results.