Saturday, April 4, 2009

get-down-dirty-website-code

According to Google, their mission is to organize the world’s information and make it universally accessible and useful. I would add that they are also striving to provide the best user experience possible.
One aspect of SEO that is often overlooked or forgotten is the technical side. I am not referring to the optimization of website code for search engine purposes. I am talking about major technical issues that cause Google to rank your site lower because they detract from the user experience.
There are three technical aspects I would like to discuss in my post today. I must stress that I am not sure how much weight each one carries; I just know that they matter. More than anything, I am hoping this post will spur discussion that helps us better understand these aspects and how they affect SEO.
1. Clean Code
It’s no secret that the fanciest websites can have very messy code. When search engine spiders visit your website, they only look at the code; they honestly couldn’t care less how the site appears to your visitors. It is very important that your website code is clean, so spiders don’t have to sort through a bunch of crap to get to what they are really looking for… your content!
Here are the basics…
• JavaScript should live in its own file and simply be referenced from the actual web page.
• If you are hosting video (this does not include embedded YouTube videos), it should live in a separate file and be referenced from the actual web page.
• All formatting should go in a cascading style sheet (CSS). A minimal example follows this list.
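To make that concrete, here is a rough sketch of what a cleanly separated page might look like. The file names (style.css and main.js) are just placeholders, not anything your site necessarily uses; the point is simply that the page itself contains little more than your content.

```html
<!DOCTYPE html>
<html>
<head>
  <title>Example Page</title>
  <!-- Formatting lives in an external stylesheet instead of inline style attributes -->
  <link rel="stylesheet" type="text/css" href="/css/style.css" />
</head>
<body>
  <h1>Welcome</h1>
  <p>Your content is what the spiders actually came for.</p>

  <!-- JavaScript lives in its own file and is only referenced here -->
  <script type="text/javascript" src="/js/main.js"></script>
</body>
</html>
```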
2. Downtime/Timing Out
How annoying is it when you visit a website and it is down? How about when you visit an hour later and it is still down? Website downtime is one of the things that really puts a damper on user experience.
Unlike clean code or page load speed, downtime is something that might be happening without the webmaster even knowing. One of my clients, who for privacy purposes we will call Green Tree Landscaping Inc. (thanks, landscape guy outside my window), has a website where the majority of pages were timing out. Whenever I visited those pages, everything seemed to be working fine; whenever they visited their site, it was up. No one knew the site was having issues until we ran a report in Xenu, which indicated the pages were either timing out or returning a connection error.
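Xenu does the heavy lifting here, but the underlying check is simple enough to sketch. Below is a rough Python example, with placeholder URLs, that flags pages that time out or refuse the connection; a link checker like Xenu essentially does this across your whole site.

```python
import socket
import urllib.error
import urllib.request

# Placeholder URLs -- swap in the pages you want to monitor
pages = [
    "http://www.example.com/",
    "http://www.example.com/services.html",
    "http://www.example.com/contact.html",
]

for url in pages:
    try:
        # Give each page ten seconds to respond before calling it a timeout
        response = urllib.request.urlopen(url, timeout=10)
        print(f"{url} -> OK (HTTP {response.getcode()})")
    except socket.timeout:
        print(f"{url} -> TIMED OUT")
    except urllib.error.URLError as err:
        print(f"{url} -> CONNECTION ERROR ({err.reason})")
```

Run something like this on a schedule and you will hear about downtime before your visitors (or Google) do.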
3. Page Load Speed
Google Webmaster Central released a blog post on the 4th of June regarding page load speed. It is obvious from this post that Google is becoming more concerned with page load speed and how it affects the user experience. Although they don’t directly state it, I believe it is becoming more of a ranking factor than it used to be. Improving page load speed used to be a daunting task: until recently, you practically had to guess what was causing a page to load slowly. On the 4th, Google released what they call Page Speed. This is a cool little Firefox add-on that integrates with Firebug. When you run Page Speed, you get instant suggestions on how to tweak your web pages to improve load time, along with educational material to help you learn (if you don’t already know) how to make the changes.
If you ask me, this is a great resource (possibly one of the best) provided by Google, and it will really help webmasters improve their sites’ performance. Just remember, it’s all about creating the best user experience possible.
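Page Speed is the right tool for a detailed breakdown, but if you just want a rough number you can track over time, a few lines of script will time how long your HTML takes to download. This is only a sketch with a placeholder URL, and it measures the raw download alone, not images, scripts, stylesheets, or rendering.

```python
import time
import urllib.request

# Placeholder URL -- swap in the page you want to measure
url = "http://www.example.com/"

start = time.time()
response = urllib.request.urlopen(url, timeout=30)
body = response.read()
elapsed = time.time() - start

# Only the raw HTML download is timed here; Page Speed looks at
# everything else that slows a real visitor down (images, CSS, JavaScript).
print(f"Downloaded {len(body)} bytes in {elapsed:.2f} seconds")
```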
SEO changes on a regular basis. What may have worked yesterday in achieving top search engine rankings may not work today or tomorrow. I believe these three aspects will play an even bigger role in determining search engine rankings in the future.