

How To SEO Strategies



Introduction

The first goal of any search engine optimization strategy is to get your web pages indexed. But even before that can happen, you need to get the search engine crawlers to visit your website.

Depending on the search engine or directory and the overall circumstances (how you invite and solicit crawlers), that first visit could take days, weeks, or even months. And while it's true that the initial crawler visits can be somewhat unpredictable (or take a long time in coming), once the ice is broken, future visits can be controlled to some degree… Basically, the more frequently you update your pages, the more frequently the crawlers will show up on your website doorstep.

Of course, that's only half the battle. The other half is getting the search engines and directories to actually index your pages. In order to do that, you need to start at the beginning.

And the beginning in this particular instance is developing and enhancing pages in such a way that the search engine crawlers will be impressed.

The overall search process is simple… All the text content that search engine crawlers gather is stored and indexed. People conduct searches based on certain phrases (keywords). Whatever content possesses the most relevancy with regard to any given keyword will be placed in the top positions of the search results.

And since the title of the page and the text content generally carry the most weight - at least with regard to what search engine crawlers deem most relevant during their visits - it stands to reason that improvement in page rank and/or search results listing can most often be attributed to having individual and specific keywords properly incorporated into those two prime areas.
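As a minimal sketch of that placement (the page, keyword, and wording here are all hypothetical), a page targeting the phrase "golf swing" might open like this:

<html>
<head>
<title>Improve Your Golf Swing - Simple Drills That Work</title>
</head>
<body>
<h1>Improve Your Golf Swing</h1>
<p>A smooth, repeatable golf swing starts with the right grip and stance…</p>
</body>
</html>

The keyword appears in the title and again at the very top of the visible content, which are the two prime areas just described.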

Of course, if keywords were the only basis for which page rank and position in search results were determined, optimizing web pages would be pretty much cut and dried… pick a keyword > use it in your title and throughout your content > achieve high page rank and top position in search engine results.

The problem is, there are so many variables that not only come into play but change on a regular basis, it can seem as though achieving solid and effective search engine optimization might never be possible.

Fortunately, it's not only possible, it can be relatively painless as well. All you have to do is satisfy the top three requirements of pretty much all major search engines…

provide quality content

update content on a regular basis

get numerous top-ranking websites to link back to your site

And the search engines and directories you should be trying to impress the most are the top four contenders…

Google

http://www.google.com

Yahoo

http://www.yahoo.com

MSN Search

http://search.msn.com

DMOZ (Open Directory)

http://dmoz.org

Beyond that, there are countless other search engines and directories like AltaVista, Ask Jeeves, and AllTheWeb. Should you simply level your sights on the major players and bypass everything below them? Not necessarily. You still want your pages listed in as many locations as possible. You just shouldn't try to satisfy every one of them with regard to optimization. Satisfy the top four contenders.

Then, if you have the time and ambition to broaden the scope of your SEO efforts, do it. If not, don't worry about the hundreds (or even thousands) of other search engines and directories that exist. You're only human. And just meeting the optimization criteria of the top four is going to be challenging enough.

Of course, unless you plan to make search engine optimization your life's work, it's not likely you'll invest most of your energy in that one single area (even when restricted to the top four players).

But you do need to invest a fair amount of quality effort.


And that basically equates to these two missions…

1. Get your pages indexed by major search engines.

2. Improve your page rank and position in search results.

In order to accomplish both of those, you need to walk a careful line between good optimization techniques and the urge to take things a bit too far. In other words, you need to make certain you carry out your two missions without stepping over into what's commonly referred to as "black hat" search engine tactics.

That dark and evil territory would include things like…

Keyword Stuffing - repeating keywords over and over again for no logical or practical reason

Hidden Content - including keywords or text that's the same color as the background for the purpose of manipulating search engine crawlers

Doorway Pages - pages not intended for viewers to see but rather to trick search engines into placing the website into a higher index position
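For illustration only, here is the kind of markup the hidden content trick relies on - shown so you can recognize it, never so you can copy it (white text on a white background):

<body bgcolor="#ffffff">
<font color="#ffffff">golf swing golf swing golf swing golf swing</font>
</body>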

Although these types of practices were once considered intelligent and effective methods of optimization, they can now result in having your website banned from search engines entirely.

In general, it's better to concentrate on the most popular and most reasonable optimization techniques. By doing that, you'll not only achieve the results you're looking for, your efforts will also have long-lasting value.

And when you consider how much work is involved in getting any website to the top of search engine rank and position, it’s worth whatever effort it takes to get it right the first time.

Search Engine Strategy Basics

For the most part, there are three basic things you’ll need to do in order to accomplish proper and effective search engine optimization.

compile keyword lists

publish keyword-rich content

establish a beneficial link strategy

Naturally, having software that can help you accomplish those three things quickly and efficiently would be a great asset. So, in addition to exploring each of these areas, we'll include the best software programs for making each task easier to perform.

Keywords

The core of any SEO strategy is built almost entirely around the group of keywords you choose to target. The first order of business is to decide which groups of keywords you'll be utilizing. In most instances, those groups will be either directly or indirectly related to the topic or niche that your website is (or will be) associated with.

Once you've established the individual groups of keywords you want to target, you can begin to compile a comprehensive list of top-level phrases that have each of the following characteristics:

they're searched for by thousands of viewers each and every month

they have little or no competition associated with them

The more people who search for the term, combined with the least amount of competition associated with it, the more valuable the keyword will be with regard to gaining automatic search engine traffic.

Beyond that, you'll want to compile lists of secondary keywords. These would still be valuable, but not to the extent that the first top-level list would be. The main advantage of lower-level keywords is the fact that you don't have to work quite as hard to get definitive search engine recognition. And since you'll automatically get a fairly decent results position, you'll also receive additional targeted viewer traffic.

To make up for the lack of quality in the keyword itself (in most cases that equates to fewer searches being conducted every month and therefore less competition), you need to work with a much larger quantity of lower-level keywords. Basically, the results will be just as good as what you experience through top-level keywords. It will just take more keywords to achieve those same results.

Of course, the good news is that there are software programs which can significantly cut down the amount of time it takes to gain content - no matter how many keywords you decide to target (see the next segment on Quality Content).

There are several ways in which you can compile keyword lists. One of the quickest and easiest methods is to use the free online suggestion tool that Overture provides. Although it will give you a clear indication of how many searches have been performed on any given topic during the previous month, it's somewhat bare-bones.

Plus, there's no way to easily transfer results from their web page to your independently compiled keyword list. When you copy and paste the Overture results, you also get the number of times each keyword has been searched. While that might be good for research purposes, you'll have to manually remove that part of the data in order to wind up with a file that only lists keywords.

Wordtracker at http://www.wordtracker.com, on the other hand, does in fact allow you to save your results with nothing but the keywords listed. You'll have to pay to use their online service, but it's well worth the price. It's highly effective and offers the most in-depth and accurate capability with regard to real searches that people perform.

Although there are numerous ways you can conduct research using Wordtracker, they will all revolve around the ability to compile keyword lists which are based on the groups of keywords you originally established. Once you know exactly which keywords you'll be targeting, you can begin to implement content that will be associated with each of those phrases.

Quality Content

There are numerous reasons why “Content Is King”.

From a viewer's perspective, content not only invites them to visit your website but encourages them to return on a regular basis. It's a relatively simple equation…

They're looking for valuable information. Give it to them.

From a search engine perspective, content is one of the primary factors in determining just how much weight or importance should be given to any web page. Unfortunately, this one isn't quite as simple an equation… Search engine crawlers gather and index content. Figure out how to make them place your content higher on the results ladder than some other website.

Of course, in order to become King, content needs to be of considerable quality. In order to remain King, content needs to be updated on a fairly regular basis.

Not to mention the fact that you also need to add content (new pages) on a regular basis. If not, whatever ground you initially gain will simply fade away. And so will whatever search position or rank you've achieved.

Naturally, you can manually add content by writing everything yourself. But that alone would be far too time-consuming. Especially when you consider all the other webmaster tasks that need your attention.

So let’s talk about automating the task instead…

One of the best methods for gaining quality content is to include keyword-rich articles on your website. And rather than take the time to write them yourself, you can simply search for and accumulate articles that others have written. Of course, doing that can also eat up a great deal of time. To minimize the task - as well as enhance the results - you can simply use the following software program.

Article Equalizer

http://www.articleequalizer.com

This allows you to accumulate up to one thousand articles with just the click of a button. And you can do it based on a specific topic or keyword. Use the articles to add quality content to your existing websites or use them to build entirely new niche sites. Either way, this is one of the fastest and most efficient methods of gathering quality content.

RSS feeds are yet another superior method. Not just for gaining content but for keeping it fresh and updated as well. Depending on what feed or feeds you happen to choose, the content can be as simple as a list of topic-related links or as complete as a full-scale, full-page article. And of course, there's everything in between.

The most common choice for RSS feeds is the kind that displays a list of topic-related URLs with a brief description beneath each one. The reason this type of feed is most popular is the fact that the brief description allows more potential for targeting specific keywords. For example, if the topic of your website is golf and you want one of your pages to be optimized for the keyword "golf swing", you would want any and all content to include that particular search phrase.
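As a rough sketch (the URL and wording here are made up), a keyword-targeted item in an RSS 2.0 feed of that kind might look like this:

<item>
   <title>Three Drills to Smooth Out Your Golf Swing</title>
   <link>http://www.example.com/golf-swing-drills.html</link>
   <description>Simple practice drills that help you groove a more consistent golf swing.</description>
</item>

Notice that the target phrase shows up in both the title and the description, which is exactly the extra mileage the brief descriptions provide.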

It’s no different than optimizing any other content on your website. 

You have a specific keyword and you need that phrase to be included in such a way that it will carry significant weight with the search engine crawlers. If you can't accomplish that, you're merely shooting in the dark, hoping to gain targeted viewer traffic without actually targeting it. The goal is to add content that is geared toward specific keywords. And RSS feeds are no different than any other content.

If it doesn't include the keywords, you'll merely get search engine credit for having generic topic-related content. Of course, what you really want - and need - is to gain rank and listing benefit from whatever content is added. That's the whole purpose… to gain enough search engine recognition which in turn gains you targeted viewer traffic.

That being the case, the ultimate software program would be one that could automatically place RSS feeds on your pages while at the same time doing it based on specific keywords.

Fortunately, there is such a program. And it’s the best software available…

RSS Equalizer

http://www.rssequalizer.com

What you achieve by using RSS Equalizer is instant theme-based content, the kind that search engines like Google are looking for. And because the content changes each and every day, you can count on receiving more frequent visits from search engine crawlers. So the ultimate result is just what you're hoping to gain… faster indexing, better search position, and higher page rank.

Linking Strategies

Choosing the right keywords and publishing quality keyword-rich content puts you approximately two-thirds of the way toward optimum search engine recognition. 

The other third is pretty much solely based on popularity.

If we were talking about popularity in the real world, it would probably include simple things like who was voted King and Queen of the high school prom, or who had the most date options on a Saturday night, or which sibling got the most attention from Mom or Dad. In the world of search engines, popularity takes on a whole different meaning.

And in most instances, it comes down to this… the website with the most quality links pointing to it wins the contest.

Link popularity.

That's the game. And the ultimate goal is to get countless "important" websites (those that have a theme or topic that's similar to yours) to provide links back to you. Of course, when we're talking about importance, we're referring to how major search engines view them. Most often, that equates to high page rank and top position in search results.

The higher up the food chain a website happens to be, the more powerful any link they provide back to you is perceived to be. In order to get the most bang out of the link popularity process, it's best if you actually seek out valuable websites. Aside from those you might already have in mind, conduct searches based on the keywords you're most interested in gaining search engine recognition for.

Naturally, someone who's in direct competition with you wouldn't even consider giving you a link back. So what you're really looking for are popular websites that have content or products that are either complementary to yours or indirectly related. For example, let's say your topic and keyword is based on ways of perfecting your golf swing.

Good link back choices would be websites with the following themes or products:

information about golf courses or golf tournaments

golf equipment or apparel

golf instructors or seminars

If the topic is related to yours and the website that's providing the link back carries a good deal of weight with major search engines, the value of your own website will automatically be elevated. When it comes to the actual link that these valuable and important websites place on their pages…

Always encourage the use of text links rather than just a URL. For example, instead of simply displaying http://www.adwordanalyzer.com as the link back to your website, you want something more substantial and keyword-rich. And, of course, search engine friendly. If one of your keywords is "targeted traffic", for example, the link might read as follows:

Drive targeted traffic to your website with Adword Analyzer
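In raw HTML - a minimal sketch, assuming the link points back to that same product page - the snippet you'd hand out would be nothing more than:

<a href="http://www.adwordanalyzer.com">Drive targeted traffic to your website with Adword Analyzer</a>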

That not only gives you credit for the keyword, it encourages the search engine crawler to perceive your website as having more value. If you have a separate page on your website where you solicit link backs, it's always a good idea to list one or more link text possibilities. That way, you'll receive credit for the keywords you yourself have chosen to target.

You should also provide the HTML code for placing your link on other websites. Basically, make it as easy as possible for someone else to add you to their pages.

You should also specify where you require a link back to be placed. Ideally, you would want your link located on either a home page or one click away from the home page. At the very least, your link should be located where it will be perceived as valuable by the search engine crawlers. Buried four or five levels deep on some obscure page that might not even be indexed is absolutely worthless.

The whole point of getting link backs is to gain more importance with the search engines.

So the bottom line is…

The more control you have over the links that others place on their websites, the more search engine value you'll experience. It takes a good deal of time and effort to encourage high-ranking websites to link back to you. Make certain you invest whatever additional effort is necessary in order to gain the best possible link as well.

And the criteria for the best possible link are as follows:

1. It includes keyword rich text.

2. It originates from a valuable and high-ranking website.

3. It’s placed in what would be considered an important location.

Anything less than that and you’re compromising the whole link back process.

Always keep in mind that in this particular instance, quality will always win out over quantity. Yes, you want a vast number of links pointing back to your website. But given a choice, you’re much better off with fewer links from important websites than countless links from sites that don’t carry much weight with search engines.

What To Do…

Following is a brief overview of what each of the major search engines and directories is looking for with regard to optimization and value.

Google

Doesn't use meta description and keyword tags. Gives a high score for the overall weight and proximity of keywords, < h > tags, and bold text. Rewards quality content, anywhere from 50 to 600 words. Content should include keywords in text and links. Likes to see keywords in the page title (utilizing 90 characters or less) and carried consistently throughout the website. Especially values link popularity, themes, and keywords in URLs and link text. The use of excessive keywords, cloaking, and link farms is viewed as SE spamming.

Yahoo

Meta description and keyword tags carry no major importance, but filling them in does play a role. Will not index anything associated with SE spam. Slow-loading pages run the risk of being excluded. The page title has some significance and should be concise. Likes site popularity and wants to see a theme throughout the website.
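Since Yahoo reads them (and MSN, below, supports them as well), it's worth filling those two tags in. A minimal sketch, with hypothetical wording:

<meta name="description" content="Practice drills and simple tips to improve your golf swing.">
<meta name="keywords" content="golf swing, golf swing drills, improve golf swing">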

MSN

Supports meta description and keyword tags. Doesn't index anything associated with SE spam. Framed sites must use the <noframes> tag to get indexed. Considers the page title important and wants it to contain keywords. Wants to see proper keyword frequency. Link popularity carries a good deal of weight. Likes to see a theme carried throughout the entire website.
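If you must use frames, the fallback markup is a <noframes> block inside the frameset - a minimal sketch with hypothetical file names:

<frameset rows="20%,80%">
   <frame src="menu.html">
   <frame src="main.html">
   <noframes>
   <body>
   <p>Keyword-rich fallback text and regular links go here, so crawlers that can't read frames still have something to index.</p>
   </body>
   </noframes>
</frameset>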

DMOZ

Likes to see concise and accurate descriptions and keywords. Slow loading pages can be penalized. The page title has some significance and should be filled in. Keyword frequency is not factored in. Link popularity is not important. Especially likes to see accurate and appropriate category choices.


What Not To Do…

After all your hard work getting your web pages optimized, the last thing you want is to do something that would prevent your site from getting indexed. 

Or worse, have it blacklisted by search engines altogether.

At the top of the "don't do" list is the use of invisible text (text that's the same color as the background). Most every search engine is wise to this practice and will currently ban any website found to be using it. Here is a quick rundown of everything else you should never do…

Don't repeat keywords excessively.

Don't place irrelevant keywords in the title and meta tags.

Don't make use of link farms.

Don't submit to inappropriate categories in search directories.

Don't submit too many web pages in one day.

Don't publish identical pages.

Don't use meta refresh tags.

No matter how good your website is - no matter how valuable the content it contains or how legally optimized it might be - if you use any of the things spelled out above, you run the risk of being blacklisted, branded as a search engine spammer.

Although it varies from one search engine to another, spamming can include one or more of the following: irrelevant web page titles and meta description and keywords tags; repetition of keywords; hidden or extremely small text; submitting web pages more than once in a twenty-four-hour span; mirror sites that point to different URL addresses; using meta refresh tags.

When it comes to directories such as DMOZ (which have human editors), spamming generally equates to one of these three practices: deliberate choice of an inappropriate category within the directory; marketing language; capitalization of letters.

It's not difficult to stay out of black hat territory. But it's certainly difficult to recover from having used those types of techniques. That is, assuming you can recover at all. Just pay attention to the rules established by search engines and directories. And since Google is the player you'll most want to satisfy, it's important that you read and re-read their webmaster guidelines, which are published at http://www.google.com/webmasters/, on a regular basis.

Break the rules and you’ll always be struggling to gain benefit from all the major search engines. Follow the rules and you’ll establish web pages that will not only be around a long time, they’ll always be in contention for top search results position.


Checklist

Keywords

Start by establishing groups of keywords that are related to your chosen topics or areas of interest.

The best keywords are ones that are searched for by thousands of viewers each and every month but have little competition associated with them.

Because secondary keywords are associated with fewer searches and less competition, you'll need to implement more of them in order to achieve maximum benefit.

Keywords should be included in the title, in < h > tags, and throughout the overall content.

Wordtracker is a comprehensive and in-depth online service for compiling accurate and effective keyword lists.

Don't repeat keywords excessively.

Don't use inappropriate keywords in the page title and description.


Quality Content

From a viewer's perspective, content not only invites them to visit your website but encourages them to return on a regular basis.

From a search engine perspective, content is one of the primary factors in determining just how much weight or importance should be given to any web page.

You need to add new content on a regular basis.

You need content that is updated frequently.

Use Article Equalizer ( http://www.articleequalizer.com ) to easily and quickly accumulate and publish keyword-rich content.

Use RSS Equalizer ( http://www.rssequalizer.com ) to place keyword-related RSS feeds on specific and individual pages.

Linking Strategy

The goal is to get countless "important" websites to provide links back to you.

The higher up the food chain a website happens to be, the more powerful any link they provide back to you is perceived to be.

Actively seek out important websites that have similar or related themes, products, or information.

Encourage link backs to include valuable and keyword-rich text rather than simply a URL address.

The best links originate from high-ranking websites, are placed in important page locations, and include keyword-rich text.

Pay attention to the rules set forth by search engines and directories, especially the webmaster guidelines published by Google.

Follow the rules and guidelines set forth by search engines and directories.

There's no doubt about it. Optimizing pages to satisfy search engines can be a tedious and demanding task. Not just initially, but throughout the duration of any website being live on the web. Basically, your search engine optimization never ends.

You strive for high page rank. That can mean an actual score like the one Google assigns to individual web pages, or merely a conceptual rating that provides your website with more search engine recognition and stature than other sites in your area of interest. Either way, the goal is to make your website more popular, more visible, more important than all the competition. You might not reach the top of the heap, but that's where you have to aim in order to land anywhere near the top.

Not that you can't reach the very top. You can. It's just not necessary in order to reap all the benefits - at least, from a strictly search engine perspective. Let's face it. If you land in the top three positions (or even on the first page) of search results, you'll most likely capture the same amount of traffic that the number one website enjoys. Maybe even more.

It all depends on your description. Or should we say, the description that a search engine displays in your listing - since meta description tags are rarely used anymore.

If your description more closely matches what a viewer is searching for, they'll go to your website first, regardless of what results position you happen to be in. And even if they don't go there first, they'll most likely get there eventually. Unless, of course, one of the other websites has totally and completely satisfied their needs and they don't feel compelled to continue their search.

The point is, it's not entirely about what position you gain in search engine results. It's about targeting a specific keyword (search term) and then making certain you accomplish these two things…

1. Your website ranks high for that keyword.

2. Your website can deliver viewer expectation for that keyword.

Of course, delivering the viewer’s expectation is fairly straightforward.

If the search term is "improve golf swing", it's a pretty safe bet the viewer is looking for something to improve their golf swing. As long as you provide information or a product (or both) that can satisfy that need, you're in excellent striking distance.

Covering the first accomplishment - getting a high rank for your website - is a whole lot more involved. It's not just about satisfying a specific viewer need. Instead, it's all about convincing a search engine that your website is superior with regard to satisfying a specific viewer need. For example…

There are over two million web pages associated with improving one's golf swing. Some contain information, some contain products. Some contain nothing more valuable than a brief mention of the search term. Regardless, there are millions of pages that show up in the search results total when a viewer types in "improve golf swing" (approximately 50,000 results if you put quotes around it, which the majority of searchers don't include).

All you have to do is dive into that vast ocean of search results and somehow manage to dog-paddle your web page past all the other possibilities and onto the sandy beach, where only a few top-ranked pages are currently basking in the sun. The only question is, how do you accomplish that? How do you wind up in front of all those other web pages?

You start by analyzing each of those top-ranked pages. You sift through their source code, their web content, their design techniques. Whatever it takes to find out exactly what they're doing that placed them in the top results positions. And then you do the same thing. Only better. And you keep doing it until you reach your ultimate goal.

That goal might just be the number one position. Or maybe it's getting listed in the top three. Or maybe you're willing to settle for any position on the first page of search results.

It doesn’t really matter.

Whatever goal you've set, whatever position you're shooting for, you level your sights on the top-ranked web pages and then do everything they're doing and more. Of course, if you're targeting a less sought-after search term, you won't have to work nearly as hard. And that's why so many savvy webmasters do just that… They deliberately seek out search terms that are valuable to their particular niche, but don't have nearly the amount of competition associated with them.

That way, simply implementing the basic optimization techniques will most often ensure them a top position in search results for any one of those keywords. Of course, you have to know which optimization techniques work for which search engines or directories. They're all different. They all set their own criteria for what elements are most important. Some put the greatest emphasis on link popularity. Others place a good deal more value on the count and density of a specific keyword on individual web pages. Still others are more interested in seeing a basic theme or topic carried throughout the entire website.

Fortunately, if you limit your optimization efforts to satisfying the top players - Google, Yahoo, MSN, and Open Directory (DMOZ) - you can cover the most important SEO bases simultaneously. For example, even though having the keyword in your page title might not carry a great deal of weight with Yahoo, it's an absolute must when it comes to satisfying Google. So put your keyword in the title. Although DMOZ doesn't care so much about links pointing to your page from other websites, Google, Yahoo, and MSN do place a considerable amount of value in how "popular" your page is.

And all of them want to see a fair amount of quality keyword-rich content and a solid topic or niche theme throughout. By incorporating all of the most important optimization techniques - the ones that are unilaterally perceived as most valuable - you'll find that you have automatically satisfied the top players.

And speaking of top players, Google is the one that you need to aim most of your time and energy toward. To assist you in that regard, the majority of this particular report contains Google-specific information. Concentrate on rising to the top of Google's results and everything else will naturally fall into place. It's just that simple.


SEO Strategy - Google Style

Google Webmaster Tools

They're free, and yet very few webmasters take advantage of the tools that Google has made available. That includes Google Sitemaps, one of the best methods for getting your pages crawled and subsequently indexed (we'll talk about that one in depth in the next segment). Listed below you'll find some of the free SEO tools that you should be using on a regular basis.

NOTE: In order to use any of these tools, you'll need a special key. Just click on "Get a Free Google API Key" or go to http://www.google.com/api and submit the form. The key will then be sent to whatever email address you specify.

Google Rankings

http://www.googlerankings.com/index.php

This tool allows you to locate the search results position for any given keyword and URL address. You can input one word at a time or multiple keywords. You also have three choices with regard to where the search will be conducted. That gives you the option of seeing what position is held in one or more of the three major contenders… Google, Yahoo, and MSN.

The nice thing about this particular tool - aside from the valuable information it provides - is the fact that it's relatively fast. Unlike other tools of this type that can take several minutes to complete the search and results process.

Google SEO Tool

http://googlerankings.com/ultimate_seo_tool.php

When it comes to keyword optimization, this tool is an absolute must. There are two steps involved which return information about keyword count, keyword density, and keyword position.

Step 1

Analyze Keywords - Gives you a list of 1, 2, and 3 word phrases that appear "x" amount of times or more on any given page ("x" is the amount you choose when first filling out the form). You also receive the density percentage for each word listed (for example, a keyword used 6 times in 300 words of text has a density of 2%). It will also display the page title, the meta description and keywords tags, and the top five most often used keywords.

Step 2

Create Position Report - Tells you what position the web page holds in Google search results for each of the top five words found in Step 1.

Googlerankings Position Tracking

http://googlerankings.com/positiontracking/

This is an excellent means of staying on top of all your search engine positions. You create a free account and then log in to input whatever URL addresses and keywords you want to keep track of. It allows you to check your ranking history, create charts, or download data to your spreadsheet application.

Google AdWords Keyword Tool

https://adwords.google.com/select/main?cmd=KeywordSandbox

Use this suggestion tool to get ideas for new keywords that can help improve your ad relevance. Enter one or more keywords and Google will show you matching queries and alternatives. Can be very helpful when running AdWords campaigns. 

Google Suggest

http://www.google.com/webhp?complete=1&hl=en

As soon as you start typing in the search query, Google will begin to suggest similar search terms. It will also show you how many results exist for each of those terms. Very helpful when compiling keyword lists or determining niche markets.

Google Sponsored Links

http://www.google.com/sponsoredlinks

Conduct a search in Google that returns sponsored link results only. This is extremely useful when you're trying to find the proper wording for your AdWords ads or need to see how your competition is doing.

Search Term Difficulty Checker

http://www.searchguild.com/difficulty/

This one doesn’t happen to be directly from Google but it has such tremendous value, it definitely had to be included here.

All you do is enter your Google API Key and a search term. (If you don’t have an API key, you can get one for free at http://www.google.com/api.)

The program will return a score factor that will let you know how difficult it would be to gain a position on the first page of Google search results for the keyword (search term) you just queried. The lower the score, the easier it will be.

Now, whenever you come up with a keyword you think might have potential, you can find out right away whether or not it's even worth investing any time and effort. Both from a traffic-generating perspective and an SEO position.

Google Sitemaps

Everyone knows about sitemaps. Traditionally, a sitemap is a separate area where you include links to every public page on your website. Sometimes they include brief descriptions of the different pages and the content they contain. Sometimes they are nothing more than a long and somewhat generic list of page links.

Some people create sitemaps with the sole purpose of giving their viewers a comprehensive web page directory. Some people create sitemaps simply to make certain the search engine crawlers find each and every available page on their website.

And then came Google Sitemaps…

Like all search engine crawlers, GoogleBot is out there with the express purpose of gathering valuable data that can be added to its searchable index. The sooner it can return with new and updated information the better. For both Google and the people who use their search engine.

With that in mind, the Google sitemap service offers a twofold solution.

First, it lightens GoogleBot's burden of having to constantly crawl the same places over and over again looking for new and updated content. Now, with a system that tells the bot when and where to crawl, the result is simply a great deal of time being saved. Time that can be spent much more efficiently. Rather than waste time on pages that have not been (and might never be) updated or changed, the bot can zero in on places that have valuable and current content that can be added to the search database.

For webmasters, Google Sitemaps offers a way to send immediate notification when any change or addition takes place within their websites. This not only increases the possibility of getting pages indexed faster, it ensures that GoogleBot can easily locate pages that are available and bypass any and all pages that aren't meant to be public.

For the sitemap files themselves, there are two different types that you can implement.

The first one is your typical list of individual pages (just like any other sitemap would display). The second type would be used as an index, listing multiple sitemaps (in the event you have more than one). The limit is 50,000 URLs per sitemap, with a maximum of 1,000 sitemaps.

Google accepts plain text versions but gives higher priority to sitemaps that are written in XML format. That's because the XML version includes valuable notification options that can be associated with each URL.

Here is a brief explanation of each of those options.

Last Modified <lastmod> - Allows you to specify the exact time and date a page was last changed or updated. This should conform to the ISO 8601 format (you can read the specifications at http://www.w3.org/TR/NOTE-datetime). If you choose not to include the time, the format for the date alone would be YYYY-MM-DD. March 9, 2006, for example, would be displayed as <lastmod>2006-03-09</lastmod>.

Change Frequency <changefreq> - Allows you to specify how often a page will change or be updated. Valid values are always, hourly, daily, weekly, monthly, yearly, and never. Be aware, however, that the value is merely used as a guide and not a command. It's possible that any given page can be crawled more or less frequently than the specified value.

Priority <priority> - Allows you to specify a number that tells how important you feel any page is in relation to all the other pages on your website. Valid values range from an absolute low of 0.0 to a maximum high of 1.0 (the default priority value of a page is 0.5).

Keep in mind that the priority you set has no bearing with regard to what search engine results position your page achieves (if any). It merely tells GoogleBot which page should be given the most importance when crawling your website.

XML Sitemap Example


<?xml version="1.0" encoding="UTF-8"?>

<urlset xmlns="http://www.google.com/schemas/sitemap/0.84">

   <url>
      <loc>http://www.example.com/</loc>
      <lastmod>2005-01-01</lastmod>
      <changefreq>monthly</changefreq>
      <priority>0.8</priority>
   </url>
   <url>
      <loc>http://www.example.com/page1.html</loc>
      <changefreq>weekly</changefreq>
   </url>
   <url>
      <loc>http://www.example.com/page2.html</loc>
      <lastmod>2004-12-23</lastmod>
      <changefreq>weekly</changefreq>
   </url>
   <url>
      <loc>http://www.example.com/page3.html</loc>
      <lastmod>2004-12-23T18:00:15+00:00</lastmod>
      <priority>0.3</priority>
   </url>
   <url>
      <loc>http://www.example.com/page4.html</loc>
      <lastmod>2004-11-23</lastmod>
   </url>
</urlset>

Sitemap Index Example

<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.google.com/schemas/sitemap/0.84">
   <sitemap>
      <loc>http://www.example.com/sitemap1.xml.gz</loc>
      <lastmod>2004-10-01T18:23:17+00:00</lastmod>
   </sitemap>
   <sitemap>
      <loc>http://www.example.com/sitemap2.xml.gz</loc>
      <lastmod>2005-01-01</lastmod>
   </sitemap>
</sitemapindex>

Notice the additional .gz extension. To reduce bandwidth, you have the option of compressing your sitemap files using gzip. Uncompressed sitemap files cannot exceed ten megabytes.

Naturally, if you have a relatively small website, managing your sitemap won't be difficult or overly time-consuming. But having a program that automates the process of updating and delivering the sitemap would still be beneficial. Of course, you probably don't have one small website. You most likely have (or will have at some point) numerous websites with hundreds if not thousands of pages each. And under those circumstances, an automated system would definitely be an asset.

Sitemap Equalizer ( http://www.sitemapequalizer.com ) is the best program for doing that. Especially if you want to make absolutely certain everything has been taken care of accurately and properly. It provides a powerful web spider that will crawl your entire site beforehand, making certain there are no dead ends or traps where a search engine spider can get stuck in a loop, unable to access all of your pages.

For more information about Google’s sitemap service, check out the following pages of their website…

Google Sitemaps

http://www.google.com/webmasters/sitemaps/

Google Sitemaps Overview

http://www.google.com/webmasters/sitemaps/docs/en/navigation.html

Google Friendly Design

No information about SEO strategy would be complete without mentioning how basic design elements can affect indexing and page rank. And in this instance, what works best for Google basically applies to all search engines. The first thing you need to understand is this…

When it comes to good optimization, the only one that really matters - the only one you need to satisfy - is the search engine crawler. Naturally, nice clean design and proper navigation are important to your viewer. But great website presentation and performance isn't much good if it doesn't comply with search engine standards or requirements.

Unlike viewers, who can view your website both outside and in, search engine crawlers only get to experience your website from the inside, by following the source code from top to bottom. And they're on a specific mission… to locate information that will help index any given page. If everything is laid out properly, the crawler will have no problem locating keywords that have been deliberately and properly placed within its path.

That allows the crawler to accurately index your web page. Which, of course, is what you ultimately want: web pages that are indexed according to the keywords that will provide you with the greatest benefit.

If the design is jumbled (or causes the source code to contain a large volume of unnecessary elements), there's a good chance the crawler will never come up with a viable indexing choice. And since the crawler is always in a hurry, it's not about to stick around for any additional or extended length of time on your behalf. If, on the other hand, the important information - the keywords you've carefully and painstakingly chosen - is located in all the right places and used in the proper context, a crawler won't have a bit of difficulty determining exactly how that particular page should be indexed.

Primarily, those crawler-friendly locations include places like the page title, clearly visible and high-placement < h > tags, and the first paragraphs and/or sentences of the main text content.

Should you ever consider incorporating the most flashy and innovative techniques on your website, think again. Doing so is never going to impress or solicit favor from search engine crawlers. (It probably won't even impress your human visitors.) Following is a basic list of what most search engine crawlers can't process (extract information from)…

Image text

Multimedia (such as Flash and streaming video)

Pages that require login or cookies

PDF files

XML

Java applets

In addition, most search engine crawlers have a hard time with things like frames and dynamically generated content - for example, URLs that include "?" (there's a concrete comparison right after the list below). If the crawlers can't navigate your site (and remember, they're navigating through the source code rather than the outside elements), they can't properly index your website. Worst case scenario is that they'll leave prematurely and never wind up fully indexing your website.

In order to optimize your pages in such a way that you satisfy both human visitors and search engine crawlers, you need to do the following:

utilize the best keywords for your topic or niche

place keywords where they are most effective and advantageous

use keywords in their proper context

include the correct amount of keywords throughout all locations

As long as you accomplish that, you'll have a website that's not only people friendly, but search engine friendly as well.
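And on the dynamically generated content point mentioned above, the difference is easy to see in the addresses themselves (both of these URLs are hypothetical):

http://www.example.com/golf/improve-golf-swing.html   (static, crawler friendly, keyword right in the URL)
http://www.example.com/page.php?id=123&cat=7          (dynamic, the kind many crawlers of this era struggle with)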


Checklist

The goal is to make your website more popular, more visible, more important than the competition.

Although it's not necessary to reach the number one search results position, you need to aim there in order to land anywhere near the top.

If your description more closely matches what a viewer is searching for, they'll go to your website first regardless of what your results position happens to be.

It's not exclusively about position. It's about targeting a specific keyword and then making certain your website 1) ranks high for that keyword and 2) can deliver what the viewer is searching for.

In order to compete with websites in top results positions, you need to find out what they're doing and then do the same thing, only better.

If you limit your optimization efforts to the top search engines and directories, you can cover the most important SEO bases simultaneously.

Take advantage of all the free SEO webmaster tools that Google and other websites have available.

Use Google Sitemaps to make certain the crawler finds all available pages.

Use Google Sitemaps to help get your pages indexed faster.

Submit XML sitemaps so you can take advantage of the notification options such as the date a page was last modified and the frequency you anticipate a page will be changed or updated.

Indicating priority only tells how important a page is in relation to all the other pages on your website. It has no bearing on what position your page will hold in search engine results.

Use Sitemap Equalizer ( http://www.sitemapequalizer.com ) to create and manage all of your sitemaps.

Don't design your web pages for viewers only. Design them to help search crawlers easily and quickly locate the specific information and keywords that you want your page indexed for.

Crawler-friendly locations include the page title, high-placement < h > tags, and the first paragraphs or sentences of the main text content.

Most search engine crawlers can't extract information from image text, multimedia such as Flash and streaming video, pages that require login, PDF files, XML, and Java applets.

Most search engine crawlers have a difficult time with things like frames and dynamically generated content and pages.

To satisfy both humans and crawlers, you need to utilize the best keywords, place keywords where they are most effective, use keywords in their proper context, and include the correct amount of keywords throughout all locations.


