Wednesday, September 30, 2009

Killer keywords and domain

Not too long ago, I wrote a post about the importance of selecting a strong domain name, along with a few tips to help you come up with good domain names. I recently came across a great example of how much added value having keywords in your URL can bring to your SEO efforts.
I have a friend who registered the domain name coldplaymusic.net. The .com was already taken and parked by some domainer, so he settled for the .net. If you do a search for “Coldplay music” you will find that coldplaymusic.net is on the second page of results. This is the shocking part… The domain name was purchased about a month ago. No links have been built to this site, and there is not a lot of content. Even with these factors working against its rankings, the site has still managed to make it onto the second page for a search with a volume of over 200 thousand searches each month.
The content on the site has been optimized, so I can’t give all the credit to the domain. However, it does suggest the importance of having keywords in your domain. Because keywords in a domain help websites show up for particular searches, some domains become quite valuable. Some of you might remember that the domain Pizza.com sold last year for over two million dollars. The previous owner purchased it fourteen years ago for twenty dollars. Now that is what I call a good ROI.
As with everything, you need to be moderate in how you place keywords in your domain. You are limited to 63 characters in your domain name (this does not include the .com, .net, or other top-level domain). Also, remember that if you have a domain like cheapviagragamblingpornreplicawatchesandmoneymakingschemes.com, it will definitely look like spam to the search engines. Keep an eye out for my next post, which will provide other tips for making your domain work for you.
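If you like to sanity-check these two rules before registering, here is a minimal Python sketch (my own illustration; the keyword list is just whatever terms you care about):

def check_domain_label(label, keywords):
    # Warn about the 63-character limit and keyword stuffing in a
    # domain label (the part before the .com/.net).
    warnings = []
    if len(label) > 63:
        warnings.append(f"label is {len(label)} chars; the limit is 63")
    stuffed = [kw for kw in keywords if kw in label]
    if len(stuffed) > 2:
        warnings.append(f"{len(stuffed)} keywords crammed in; looks spammy")
    return warnings

print(check_domain_label("coldplaymusic", ["coldplay", "music"]))
# [] -- two relevant keywords, well within the limit
print(check_domain_label(
    "cheapviagragamblingpornreplicawatchesandmoneymakingschemes",
    ["cheap", "viagra", "gambling", "porn", "replica", "watches"]))
# ['6 keywords crammed in; looks spammy']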

Thursday, September 24, 2009

i put that seo button

For all you people out there who are looking to do SEO, I have some seriously bad news. You might want to sit down for this. Ready? There is no instant SEO button. I hate to be the one to break it to you. There’s no switch either, no simple form to sign, no trick to use, no connections you can have with people on the inside. If you want to be up there at the top of the field with the best sites, you can’t just call up Google and say “OK, I’m ready.” It takes work and it takes time.
Even though this is fairly well known by now, it’s tempting to think that SEO is that simple. Regardless of how good your SEO firm is, you still have to be a relevant site. Even then, it will still take work and effort to get you to the top of the search engine rankings.
From time to time, a new site will sign a contract for SEO and stop its own development, essentially putting all its eggs in one basket. This actually makes things more difficult for the SEO firms. The fact of the matter is that we don’t suddenly make your site more relevant to people searching on your terms. We work to help Google see your site better so it can decide how relevant you are. We will make suggestions on how to make your site more relevant to your users. A hard but necessary question to ask yourself is whether your site really is the best site to show up for the given keyword. If it’s not, perhaps you need to make some adjustments.
Here are a few tips to making your site the best site available for your keywords:
• Make sure you have some method of keeping your site up to date, and a source of information (where possible). A blog, or news section works well for this idea.
• Don’t fall too in love with the overall design and look of your site. Be willing to make changes, and reorganize and restructure how the site works.
• Consider keywords that don’t have corresponding pages. Perhaps pages need to be created to fit that missing piece.
• Most importantly, continue to develop your site like you would your business.
In the end, having a website that people want to find makes SEO that much better and faster. Working with your SEO service provider to make sure that you really are the best site out there will do wonders, not to mention make a lot of people happy–including your SEO firm.

Friday, September 18, 2009

choosing the best keywords

Ah, the beauty of search marketing! What other marketing medium lets you get your ad in front of your potential customer at the precise moment they are looking for exactly what you sell? Of course, your success with search marketing hinges on selecting the right keywords.
Good keyword selection starts with a brainstorming session. Get together with your team and make a list of all the keywords and phrases that people might be searching for to find a business like yours. Try to include terms that people outside your industry would use. Take a look at your competitors’ websites to see what keywords they are targeting. Review their meta keywords, titles, and content to identify additional keywords to add to your list. Another great place to look for keywords is forums, blogs, industry sites, and Q&A sites like Yahoo Answers. What are the words people use to find the product or service you offer?
Once you’ve built your list, use online keyword tools to see how often your keywords are searched and refine your list. A few of these tools are Keyword Discovery, WordTracker, Google’s Keyword Tool, and the Yahoo/Overture Keyword Inventory Tool. These tools show how often your keywords are searched and give you ideas for other keywords to consider.
It’s important to target keywords that are relevant to your business, but don’t sacrifice relevance for search popularity; that is, don’t forget about your most targeted keywords just because they aren’t as heavily searched. You want keywords that will drive traffic to your site AND increase sales. Once you identify the keywords that are most relevant to your website, you can use that list for search engine optimization, pay-per-click (PPC) keyword ads, and any other keyword advertising. After you start getting traffic for these keywords, spend some time reviewing your analytics reports. Monitor the keyword conversion report to see which keywords are generating sales. If you find certain keywords perform better than others, shift your focus to those keywords to achieve the best possible results. Over time you will fine-tune your keyword strategy to the point where you know exactly which keywords perform best, and you will enjoy a steady flow of new business from the search engines.
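As a sketch of that review step, here is some minimal Python; the CSV file and its columns (keyword, visits, conversions) are hypothetical stand-ins for whatever your analytics package actually exports:

import csv

# Hypothetical export of a keyword conversion report.
with open("keyword_report.csv") as f:
    rows = list(csv.DictReader(f))

# Compute a conversion rate for each keyword.
for row in rows:
    visits = int(row["visits"])
    row["conv_rate"] = int(row["conversions"]) / visits if visits else 0.0

# Rank keywords by what actually sells, not just by traffic.
rows.sort(key=lambda r: (r["conv_rate"], int(r["conversions"])), reverse=True)
for row in rows[:10]:
    print(f'{row["keyword"]:30s} {row["conv_rate"]:.1%}')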

Saturday, September 12, 2009

freshest summer blogging tips

It’s summertime and if you’re like most people, you didn’t include your blog in your spring cleaning rotation. Just as artists and musicians go through different phases and creative change-ups, bloggers need to “clean house” and evolve with the seasons as a way to improve, fine-tune and stay relevant.
These blogging ideas are hot out of the oven, right off the conveyor belt, organically grown and sealed air-tight to preserve freshness. These fresh ideas are guaranteed to win new readers and increase the interaction and engagement levels of those you already have.
Web Development
1. Change the background color(s) on your site. It’s like rearranging your furniture.
2. Rework (or create) your logo. Make it something that will scale, big or small (something you can turn into a favicon or print on a t-shirt).
3. Ditch the long URL by upgrading your Wordpress- or Blogspot-hosted sites to your own hosting (the mark of a serious blogger).
4. Commit to implement, learn and use Google Analytics. Digging into your site’s metrics is the best way to improve your design and your writing. It empowers you to give back to your readers.
5. Add social bookmarking buttons and an RSS button to your site and posts (Wordpress plug-ins make this a piece of chocolate cake).
6. Make your site “open” by allowing readers to post comments without registering (registration can be a barrier to participation).
7. Cross-pollinate your blog with your Twitter account (which implies that you should have a Twitter account). Once again, this is cake-easy with Wordpress plugins.
8. Learn HTML and CSS so you can make the kinds of minor tweaks that will sharpen your design.
Content
9. Revamp your writing ambience. Write outside, write with a huge grin on your face until you finish, or write with a lemon wedge in your mouth. (It may even shorten your writing time).
10. Develop theme days to create some reader-reward attachment. SEOmoz does “Whiteboard Friday.” Maybe you do “Rainy Day Reviews” when it rains.
11. Create something offline and share it online, be it photography, a cooking experiment, a poem, or a blind contour drawing. Do it as a planned post rather than an afterthought.
12. Go the extra mile for a post to show your dedication. For example, instead of just raving about something, make a Facebook fan page and pitch it to your readers. (As an example, see a page I created about my passion for retro stripes).
13. Look for opportunities to interview a professional as part of a blog post. It’s more than most bloggers are willing to do for their readers.
14. Do way too much research for a blog post so that you become the comprehensive, exhaustive “blogosphere” reference on the topic.
15. Shorten your blog titles. Make them short enough to get mileage on Twitter, i.e., someone can retweet and include the blog post, host blog, short URL to the post, and a personal endorsement–all in under 140 characters (see the quick length check after this list). Format: “RT @scottcowley 25 Freshest Summer Blogging Tips | SEO.com http://bit.ly/abc123 – Nice post.”
16. Use a story or example in every post. A commonality of buzz-less posts is that while they may be true and helpful, they are unmemorable and have not pushed any “reader engagement buttons,” which is what stories and examples do.
17. Devote a paragraph to a particular niche of people you find interesting. SEO guru Jeff once highlighted librarians in just a portion of his post about new search engines, and random librarian news sites started linking to it, sending hundreds of visitors his way. You’ve got to give credit to those types of plugged-in groups.
18. Write amazing titles. In the Twitter and social bookmarking world, you are judged by your headlines and not by your content. Use the words “fresh, new, hot, or end-all” (5 points if you can do all of them in one title). The great marketing secret is that one man’s “old and boring” is another man’s “fresh, new, hot, or end-all” so don’t be self-conscious about making such claims.
19. Create a video tutorial. Make sure it exudes the “you” brand like your writing does. For example, my wife’s Master’s research explores new media’s place in the writing classroom, so she created a video called “What is New Media?” Low budget, but effectively “academic.”
20. Highlight current events. Hot topics are the sharpest hooks. As proof, one of the most-read SEO.com blog posts ever was actually a review of a new Android phone that our CEO bought. If you have any connection whatsoever to a current trend or event, jump on it before it goes stale.
Aftermath
21. Commit to marketing your own posts. Don’t be a purist. You wrote it. If it was good enough to post, it is good enough to market, share, Tweet, Digg, Stumble, etc. Self-promotion as a blogger will give you valuable experience.
22. Be a blogosphere/Twittersphere octopus. If you mention someone, someone’s blog or post, site, company, or anything with a figurehead web presence, use a brief blog comment or tweet to let them know that you did. People love recognition, and goodwill spreads like the pox.
23. Try responding directly to all comments made about your post. You may be surprised at how easy it is and how responsive your readers will be to this newfound interaction.
24. Use targeted requests for comments on your post. After you post, make a list of the people who would be most interested in the topic and send them a Twitter DM asking for their comments. Last week, Jacob Brown used this great technique to get my feedback on his post.
25. Use blogging as ongoing experimentation. Take detailed notes and track the metrics surrounding your blog posts. Figure out what works for your blog and readership and revisit successful approaches as a means to test and refine your best practices.
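Here is the quick length check promised in tip 15, as a throwaway Python sketch; the handle, title, and short URL are just the example values from above:

def retweet(handle, title, site, short_url, endorsement="Nice post."):
    # Compose the retweet format from tip 15 and return it with its length.
    tweet = f"RT @{handle} {title} | {site} {short_url} - {endorsement}"
    return tweet, len(tweet)

tweet, n = retweet("scottcowley", "25 Freshest Summer Blogging Tips",
                   "SEO.com", "http://bit.ly/abc123")
print(n, "chars:", tweet)
assert n <= 140, "title too long to leave room for a retweet"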
Have you ever done a major clean-up of your blogging? Did you try anything with amazing results? Any failures you’d share with others?

Friday, September 11, 2009

How to choose the right keywords

Choosing keywords that work means focusing on what people actually search for, not on what your product, article, or content is called.
For example, people would rather search for something like “hotel room prices” than for “Hotel Mutiara,” so a company name is rarely what people type into a search box. From the search engine’s point of view there is no problem either way, since both keywords will be indexed by Google. So once again, focus on what is generally searched for.
To choose keywords and find out what people search for most, thinking hard over a cup of coffee will rarely get you the right answer. It is better, faster, and more accurate to use an existing tool. Below are some keyword tools you can use:

inventory.overture.com
With Overture’s tool, you can enter a keyword and be given the number of searches for it in recent months. You will also be given a list of relevant alternative keywords and the number of searches for each of those alternatives.

Wordtracker.com
This tool has an advantage because it includes a thesaurus, so you will be given the keywords people commonly use when searching for things related to the keywords you provide. With this you can choose more appropriate keywords than the ones you entered.

adwords.google.com
A tool from Google that was actually made for advertisers on Google AdWords. But the tool is very useful for finding the best keyword choices, and you can also peek at how competitive the keywords that advertisers use are. In addition, this Ajax-based tool is very easy to use for selecting keywords, and you can export your list directly to an Excel file.

A complete list of keyword tools can be found in the tools category.

Wednesday, September 9, 2009

link building strategies for new website

Believe it or not, all SEOs were newbies at one time or another. Launching a website can be a very exciting event for an individual or new business owner. However, that excitement can quickly turn to disappointment when the site owner finds out that they are getting little or no traffic. If they are getting traffic, it is most likely coming from family members or friends whom they notified through an email or Facebook.
Being involved in SEO and marketing in general, I am usually bombarded by family members and friends with questions about marketing their website or their future website on the Internet. Because of this and my willingness to help just about anyone I know, you can usually find me in a small dark corner at a family party (it has to be small and dark because my wife gets ticked when I’m not up and socializing with everyone), on the phone in the car, or answering an email, Facebook message, or Twitter message about SEO and other internet marketing tactics.
I was on a call yesterday with one of my best friends from high school. Our conversation was focused on general search engine optimization principles and link building tactics that could provide a good foundation and hopefully, if he does them right, some strong rankings in the search engines. One thing I emphasized is the fact that whatever he does, he needs to build links naturally. A natural link building campaign is crucial for success in SEO.
A natural link building campaign is one that is just that, natural! Getting 50,000 links within the first twenty days of your website’s existence is definitely not natural. Getting a slow trickle of links coming into your website and then building up looks much more natural. Also, getting 50,000 links pointing to your home page with the same anchor text is not natural. Vary your anchor text and include long tail versions of your keywords. You should also build links to other pages of your site in addition to your home page. All of these things help with building a natural link campaign.
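One easy way to spot an unnatural pattern is to tally the anchor text in your backlink report. A minimal Python sketch, with made-up anchors for illustration:

from collections import Counter

# Hypothetical anchor texts pulled from a backlink report.
anchors = [
    "coldplay music", "coldplay music", "coldplay music", "coldplay music",
    "coldplay concert tickets", "www.example.net", "click here",
]

counts = Counter(anchors)
for text, n in counts.most_common():
    share = n / len(anchors)
    flag = "  <-- looks unnatural" if share > 0.5 else ""
    print(f"{text:30s} {share:.0%}{flag}")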
In terms of links, there are two types you can get for your website: external and internal links. Both are very important and can make a huge difference in your search engine rankings. I want to discuss, in detail, ten ways to effectively jump-start your link building campaign.
Friends & Family Members
When beginning a link building campaign, one very natural way of obtaining links is from friends and family members. I just opened up our family blog and counted all of the friends and family who we have added in our blogroll. The grand total… 41. How hard is it to simply call up (recommended – it’s more personal) or email your family and friends who have blogs or other sites, and ask them for a link to your new website? Not hard at all! In fact, because they are your friends or family members, they will probably do it without hesitation. You can do the same thing with Facebook, MySpace, Twitter, etc.
Add A Blog To Your Domain
Google loves blogs! Google loves fresh content! Google loves internal links! Internal links from other pages of your website are a guaranteed way to help you increase your rankings in the search engines. Adding a blog to your domain is a great way to easily add new content to your website on a regular basis. In most cases, adding a blog to your main domain is rather simple and can be done in as little as three clicks of your mouse. Web hosting companies like myhosting.com, Hostmonster, & Bluehost offer one-click installations of blog platforms like Wordpress or b2evolution.
Adding a blog, posting to it at least once daily, and linking back to your home page and other important pages of your website with keyword anchor text is a great way to gain a lot of internal links. When blogging, you should also link out to other websites that interest you and websites that are in similar industries. You might also consider adding images, videos, polls, etc. Mix things up! Make it look natural! Most important, have fun!
Local Organizations
Just about every city in the country has a local chamber of commerce; mine would be the Lehi Area Chamber of Commerce. Joining a chamber of commerce will not only get you a very good link back to your website but also provide networking opportunities with other local businesses. Most times, depending on your business, you can generate new leads rather easily by getting to know other individuals and businesses in your area.
Local News
Developing relationships with local newspaper writers and always making yourself available for comments on news stories that involve your industry is a great way to build awareness about your company and get links back to your website (if the newspaper is published online).
Help A Non Profit Organization
You would be surprised how many non-profit organizations are operating in your city and your state. Every one of us has been given talents and certain things that we are good at. I guarantee that there is a non-profit organization that could use your help, whether it is with designing their website or painting the conference room of their new office building. If you offer yourself and your services freely, you can most likely ask for, and get, a link to your website from theirs.
Submit Your Site To Local & Industry Specific Online Business Directories
Anyone studying SEO can usually find a blog post about submitting your website to directories on the Internet. Yes, this is a valuable link building tactic, but before you go crazy and start submitting to thousands of directories, seek out local directories and directories that are specific to your business. Submit to these first and take your time filling out all of the information that they ask for. These will be some of your most valuable links since they are so relevant to your website and business.
Social Bookmarking
You have probably heard the term social bookmarking. You have probably heard that social bookmarking is a great way to build links. Well, it is and it isn’t… You can waste a lot of time social bookmarking if you are submitting to the wrong sites. I limit my social bookmarking to Mixx, Propeller and sometimes Kirtsy. The trick to social bookmarking is to not only bookmark your website, your blog posts, and other things related to your website, but bookmarking lots of different things that interest you. This will make your bookmarking profiles look much more natural to both viewers and the search engines.
Write An Article And Submit To Article Directories
Writing articles and submitting them to sites like Ezine Articles, Article City, and Go Articles (a practice known as article marketing) is a great way to get links back to your website. Take time to write a very detailed article about your industry and submit it to a few article directories like the ones listed above. You will get links back to your website by properly using the author resource or bio box at the end of the article. Tell a little about yourself and your company, while adding keyword-rich anchor text links pointing back to your website.
Write And Submit A Press Release
Writing press releases and submitting them to places like PR Web or Web Wire is a great way to generate interest and buzz about your business and to get links back to your website. This method of link building is a little more difficult than the others. Writing a press release takes skill… a skill which the average person, like me, does not have. Press releases have certain requirements that must be met, a certain format that must be followed, and in most cases, they need to be super interesting. Missing any of these elements can almost guarantee that your press release will either be outright rejected or not distributed to other news-related websites.
Build A Hub Page Or Squidoo Lens
Building a Hub Page or Squidoo Lens is a fun way to get links back to your website. To date, I have built 73 Hubs and 25 Lenses for my own personal websites and hundreds for clients that I have managed. The most effective Hub Pages and Squidoo Lenses are those that have at least 450 words of text about a certain topic or subject, videos, pictures, polls, and other gadgets that are easy to add. You want to make the page as interactive as possible so it provides value for anyone who happens to read it. You are allowed two links to other websites from your Hubs and a handful of links (be conservative, don’t spam) from your Lenses.
Building links isn’t pretty. But, in order to rank well in the search engines, it is absolutely necessary. These are ten excellent ways for new companies or new website owners to start building links to their websites. There are many others and I encourage any of our readers to add to this list by commenting on this post.

Sunday, September 6, 2009

killer robots

If you haven’t heard of Mr. Robots, don’t blame yourself. It wasn’t even on the SEO map till just a couple years ago. Most of you, however, know what it is but don’t know exactly how to dominate the robots.
Robots.txt files are no secret. You can spy on literally anyone’s robots file by simply typing “www.domain.com/robots.txt.” The robots.txt should always, and only, be in the root of the domain, and EVERY website should have one, even if it’s generic, and I’ll tell you why.
There’s mixed communication about the robots file. Use it. Don’t use it. Use meta robots instead. You may have also heard advice to abandon the robots.txt altogether. Who is right?
Here’s the secret sauce. Check it out.
First things first, understand that the robots.txt file was not designed for human usage. It was designed to tell search ‘bots’ exactly how they may behave on your site. It sets parameters that the bots have to obey and dictates what information they can and cannot access.
This is critical for your site’s SEO success. You don’t want the bots looking through your dirty closets, so to speak.
What is a Robots.txt File?
The robots.txt is nothing more than a simple text file that should always sit in the root directory of your site. Once you understand the proper format, it’s a piece of cake to create. This system is called the Robots Exclusion Standard.
Always be sure to create the file in a basic text editor like Notepad or TextEdit and NOT in an HTML editor like Dreamweaver or FrontPage. That’s critically important. The robots.txt is NOT an HTML file and is not even remotely close to any web language. It has its own format, completely different from any other language out there. Lucky for us, it’s extremely simple once you know how to use it.
Robots.txt Breakdown
The robots file is simple. It consists of two main directives: User-agent and Disallow.
User Agent
Every item in the robots.txt file is specified by what is called a ‘user agent.’ The user agent line specifies the robot that the command refers to.
Example:
User-agent: googlebot
On the user agent line you can also use what is called a ‘wildcard character’ that specifies ALL robots at once.
Example:
User-agent: *
If you don’t know what the user agent names are, you can easily find these in your own site logs by checking for requests to the robots.txt file. The cool thing is that most major search engines have names for their spiders. Like pet names. I’m not kidding. Slurp.
Here are some major bots:
Googlebot
Yahoo! Slurp
MSNbot
Teoma
Mediapartners-Google (Google AdSense Robot)
Xenu Link Sleuth
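If you want to pull those names out of your own logs, here is a minimal Python sketch; it assumes an Apache-style combined log format (and a file called access.log), so adjust the pattern for your server:

import re
from collections import Counter

# In a combined log, the user agent is the last quoted field on the line.
pattern = re.compile(r'"[A-Z]+ /robots\.txt[^"]*" \d+ \S+ "[^"]*" "([^"]*)"')

agents = Counter()
with open("access.log") as log:
    for line in log:
        match = pattern.search(line)
        if match:
            agents[match.group(1)] += 1

# The most frequent requesters of robots.txt are your crawlers.
for agent, hits in agents.most_common(10):
    print(f"{hits:6d}  {agent}")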
Disallow
The second most important part of your robots.txt file is the ‘disallow’ directive line, which is usually written right below the user agent. Remember, just because the disallow directive is present does not mean that the specified bots are completely disallowed from your site; you can pick and choose what they can and can’t index or download.
The disallow directives can specify files and directories.
For example, if you want to instruct ALL spiders to not download your privacy policy, you would enter:
User-agent: *
Disallow: /privacy.html
You can also specify entire directories with a directive like this:
User-agent: *
Disallow: /cgi-bin/
Again, if you only want a certain bot to be disallowed from a file or directory, put its name in place of the *.
This will block spiders from your cgi-bin directory.
Super Ninja Robots.txt Trick
Security is a huge issue online. Naturally, some webmasters are nervous about listing the directories that they want to keep private thinking that they’ll be handing the hackers and black-hat-ness-doers a roadmap to their most secret stuff.
But we’re smarter than that aren’t we?
Here’s what you do: If the directory you want to exclude or block is “secret” all you need to do is abbreviate it and add an asterisk to the end. You’ll want to make sure that the abbreviation is unique. You can name the directory you want protected ‘/secretsizzlesauce/’ and you’ll just add this line to your robots.txt:
User-agent: *
Disallow: /sec*
Problem solved.
This directive will disallow spiders from indexing directories that begin with “sec.” You’ll want to double check your directory structure to make sure you won’t be disallowing any other directories that you wouldn’t want disallowed. For example, this directive would disallow the directory “secondary” if you had that directory on your server.
To make things easier, just as with the user agent directive, wildcard behavior applies to the disallow directive. Disallow rules are prefix matches by default: if you disallow /tos, it will block files beginning with ‘tos,’ such as /tos.html, as well as any file inside the /tos directory, such as /tos/terms.html.
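Since disallow rules are prefix matches, it is worth a quick test against your real directory names. A tiny Python sketch with made-up paths:

# Disallow rules match URL paths by prefix.
disallow = "/sec"
paths = ["/secretsizzlesauce/recipe.html", "/secondary/page.html",
         "/tos.html", "/tos/terms.html", "/about.html"]

for path in paths:
    verdict = "BLOCKED" if path.startswith(disallow) else "allowed"
    print(f"{verdict:8s} {path}")
# Note: /secondary/page.html gets blocked too -- check before you ship.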
Important Tactics For Robot Domination
• Always place your robots in the root directory of your site so that it can be accessed like this: www.yourdomain.com/robots.txt
• If you leave the disallow line blank, it indicates that ALL files may be retrieved.
• User-agent: *
Disallow:
• You can add as many disallow directives to a single user agent as you need to but all user agents must have a disallow directive whether the directive disallows or not.
• To be SEO kosher, at least one disallow line must be present for every user agent directive. You don’t want the bots to misread your stuff, so be sure and get it right. If you don’t get the format right they may just ignore the entire file and that is not cool. Most people who have their stuff indexed when they don’t want it to be indexed have syntax errors in their robots.
• Use the Analyze Robots.txt tool in your Google Webmaster Account to make sure you set up your robots file correctly.
• An empty robots is the exact same as not having one at all. So, if nothing else, use at least the basic directive to allow the entire site.
• How do you add comments to a robots file? All you need to do is put a # in front, and that entire line will be ignored. DO NOT put comments at the end of a directive line; that is bad form and some bots may not read it correctly.
• What stuff do you want to disallow in your robots?
o Any folder that you don’t want the public eye to find, or folders that should be password protected but aren’t.
o Printer-friendly versions of pages (mostly to avoid the duplicate content filter).
o Your image directory, to protect your images from leeches and to make your content more spiderable.
o CGI-BIN, which houses some of the programming code on your site.
o Bots you find in your site logs that are sucking up bandwidth and not returning any value.
Killer Robot Tactics
• This setup allows the bots to visit everything on your site and sometimes on your server, so use it carefully. The * specifies ALL robots and the open disallow directive applies no restrictions to ANY bot.
User-agent: *
Disallow:
• This setup prevents your entire site from being indexed or downloaded. In theory, this will keep ALL bots out.
User-agent: *
Disallow: /
• This setup keeps out just one bot. In this case, we’re denying the heck out of Ask’s bot, Teoma.
User-agent: Teoma
Disallow: /
• This setup keeps ALL bots out of your cgi-bin and your image directory:
User-agent: *
Disallow: /cgi-bin/
Disallow: /images/
• If you want to disallow Google from indexing your images in their image search engine but allow all other bots, do this:
User-agent: Googlebot-Image
Disallow: /images/
• If you create a page that is perfect for Yahoo!, but you don’t want Google to see it:
User-Agent: Googlebot
Disallow: /yahoo-page.html
# Don’t use user agents or robots.txt for cloaking. That’s SEO suicide.
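Rather than eyeballing the syntax, you can sanity-check any of the setups above with Python’s built-in robots.txt parser; here is a minimal sketch using the cgi-bin and images rules:

import urllib.robotparser

# The rules from the cgi-bin and images example above.
rules = """\
User-agent: *
Disallow: /cgi-bin/
Disallow: /images/
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(rules.splitlines())

for url in ("http://www.yourdomain.com/index.html",
            "http://www.yourdomain.com/cgi-bin/form.cgi",
            "http://www.yourdomain.com/images/logo.png"):
    verdict = "allowed" if rp.can_fetch("Googlebot", url) else "blocked"
    print(f"{verdict:8s} {url}")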

If You Don’t Use a Robots.txt File…
A well written robots.txt file helps most sites get indexed up to 15% deeper. It also allows you to control your content so that your site’s SEO footprint is clean, indexable, and literal fodder for search engines. That is worth the effort.
Everyone should have and employ a solid robots.txt file. It is critical to the long term success of your site.
Get it done.
Bling.

Tuesday, September 1, 2009

duplicate content

Just like the lamb lying down with the lion, the big three search engines came together for a rare joint effort to announce a new feature that will help ease the world’s duplicate content problems. The new feature, the canonical link tag, tells the search engines which version of a URL is the correct one to index.
The link tag goes in the <head> section of the page and looks like this:

<link rel="canonical" href="http://www.example.com/page.html" />
The search engines have been doing a pretty good job figuring out the right URL when people use redirects properly, but it’s not always easy to get it right–especially when we build these nutty sites with 50 different URLs pulling up the same content. This change should help a lot of webmasters sleep easier knowing that their proper, canonical URL will be indexed and given all the link juice it deserves. I look at it like a page-by-page sitemap that tells the search engines which URL to index.
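Here is a minimal Python sketch of that page-by-page idea: strip the parameters that create duplicate URLs (the names in IGNORED are hypothetical; use whatever your site actually appends) and emit one canonical tag for every variant:

from urllib.parse import urlparse, parse_qsl, urlencode, urlunparse

# Parameters that create duplicate URLs without changing the content.
IGNORED = {"utm_source", "utm_medium", "sessionid", "sort", "ref"}

def canonical_link(url):
    # Build the <link rel="canonical"> element for a requested URL.
    parts = urlparse(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query) if k not in IGNORED]
    clean = urlunparse(parts._replace(query=urlencode(kept), fragment=""))
    return f'<link rel="canonical" href="{clean}" />'

print(canonical_link("http://www.example.com/shoes?sessionid=42&color=red"))
# <link rel="canonical" href="http://www.example.com/shoes?color=red" />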
Official blog posts about the new link / canonical tag:
Yahoo
Google
Microsoft
Coverage on other blogs:
Search Engine Land
Joost already created a WordPress plugin for this feature
3 Reasons to Use rel=canonical, 4 Reasons not to use it
 