Top 5 FREE WordPress Plugins for SEO

WordPress is without question the most popular stand-alone blog platform. It is flexible and customizable, and there are lots of useful plugins providing almost any functionality a blogger can think of. However, a fresh installation of a WordPress blog leaves a lot of room for improvement; for instance, search engine optimization and protection against duplicate content.

With these WordPress SEO plugins you can further optimize your blog for search engines.

1) HeadSpace2 SEO
Link: http://urbangiraffe.com/plugins/headspace2/
HeadSpace is a powerful all-in-one plugin to manage meta-data and handle a wide range of SEO tasks. With it you can tag your posts, create custom titles and descriptions that improve your page ranking, change the theme or run disabled plugins on specific pages, and a whole lot more.

Because configuring meta-data can be a complicated and tiresome process, HeadSpace provides several shortcuts to reduce your effort:

* Meta-data nesting – data is collected not only from the page itself, but also from nested parent pages
* Dynamic data extraction – why repeat yourself when you can extract data from the post itself?
* Full GUI – data is entered alongside post content, with a full auto-suggesting AJAX interface for tags and keywords
* Mass editing – edit meta-data for all pages and posts in one go!
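The "dynamic data extraction" idea above can be pictured in a few lines: derive the meta description from the post body instead of typing it by hand. Below is an illustrative Python sketch (HeadSpace2 itself is a PHP plugin); the `summarize_for_meta` helper and the 155-character limit are my own assumptions, not the plugin's actual behavior:

```python
import re

def summarize_for_meta(post_html, max_len=155):
    """Hypothetical helper: build a meta description from the post
    body itself, the way HeadSpace2's dynamic extraction avoids
    repeating yourself. Not the plugin's actual code."""
    # Strip tags and collapse whitespace.
    text = re.sub(r"<[^>]+>", " ", post_html)
    text = re.sub(r"\s+", " ", text).strip()
    if len(text) <= max_len:
        return text
    # Cut at the last word boundary before the limit.
    return text[:max_len].rsplit(" ", 1)[0] + "..."

description = summarize_for_meta(
    "<p>WordPress is a flexible platform. Plugins extend it further.</p>")
print('<meta name="description" content="%s">' % description)
```

The same approach works for keywords or any other field you would rather not maintain by hand.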

2) All in One SEO Pack
Link: http://semperfiwebdesign.com/
Features:

* Advanced Canonical URLs
* Fine tune Page Navigational Links
* Built-in API so other plugins/themes can access and extend functionality
* ONLY plugin to provide SEO Integration for WP e-Commerce sites
* Nonce Security
* Support for CMS-style WordPress installations
* Automatically optimizes your titles for search engines
* Generates META tags automatically
* Avoids the typical duplicate content found on WordPress blogs
* For beginners, you don’t even have to look at the options; it works out of the box. Just install it.
* For advanced users, you can fine-tune everything
* You can override any title and set any META description and any META keywords you want.
* Backward-Compatibility with many other plugins, like Auto Meta, Ultimate Tag Warrior and others.

3) SEO Friendly Images
Link: http://www.prelovac.com/vladimir/wordpress…friendly-images

SEO Friendly Images is a WordPress optimization plugin that automatically updates all images with proper ALT and TITLE attributes. If your images do not already have ALT and TITLE set, SEO Friendly Images will add them according to the options you set. Additionally, this helps make the post valid W3C XHTML.

The ALT attribute is an important part of search engine optimization. It describes your image to the search engine, and when a user searches for a certain image, it is a key factor in determining a match.

The TITLE attribute plays a lesser role, but it is important for visitors, as its text automatically appears in a tooltip when the mouse hovers over the image.
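The plugin's behavior is easy to picture: scan the post HTML for img tags and fill in any missing ALT or TITLE from the post title. Here is a rough Python illustration (SEO Friendly Images itself is PHP; `add_alt_and_title` is a made-up name for the sketch):

```python
import re

def add_alt_and_title(html, post_title):
    """Roughly what SEO Friendly Images does: give every <img> tag
    an ALT and TITLE derived from the post title when they are
    missing. Illustrative sketch, not the plugin's actual code."""
    def fix(match):
        tag = match.group(0)
        if "alt=" not in tag:
            tag = tag[:-1] + ' alt="%s">' % post_title
        if "title=" not in tag:
            tag = tag[:-1] + ' title="%s">' % post_title
        return tag
    return re.sub(r"<img\b[^>]*>", fix, html)

print(add_alt_and_title('<img src="photo.jpg">', "My Holiday Post"))
```

Note that existing attributes are left untouched, which mirrors the "if not already set" behavior described above.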

4) Platinum SEO Pack
Link: http://techblissonline.com/platinum-seo-pack/
Features:

* Optimized Post and Page Titles for search engines
* Generates all SEO relevant META tags automatically
* Helps you avoid duplicate content
* Lets you override any title and set any META description and META keywords, for any post or page
* Compatible with most other plugins, like Auto Meta, Ultimate Tag Warrior, and others. However, you may have to disable the All in One SEO Pack.
* You don’t have to fear changing permalinks. If you are not satisfied with the current permalink, change it through Settings > Permalinks in your admin panel without worrying about loss of PageRank or a Google penalty. Platinum SEO will take care of issuing a 301 redirect to the new location. This is an essential new feature, not present in All in One SEO.
* Adds meta description and meta keywords tags to WordPress category and tag pages.
* Adds index, noindex, follow, nofollow, noarchive, nosnippet, noodp, and noydir meta tags to any post/page. These options are not available in All in One SEO Pack.
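The 301-on-permalink-change behavior is easy to verify from outside. Below is a small Python check; the URLs are placeholders for your own old and new permalinks, and `check_permanent_redirect` is a hypothetical helper (the plugin issues the redirect; this script only inspects it):

```python
import urllib.request
import urllib.error

def check_permanent_redirect(old_url, expected_new_url):
    """Verify that an old permalink answers with a 301 pointing at
    the new location, as Platinum SEO promises after a permalink
    change. The URLs you pass in are your own."""
    class NoRedirect(urllib.request.HTTPRedirectHandler):
        def redirect_request(self, req, fp, code, msg, headers, newurl):
            return None  # don't follow; we want to see the redirect itself
    opener = urllib.request.build_opener(NoRedirect)
    try:
        opener.open(old_url)
    except urllib.error.HTTPError as err:
        return (err.code == 301
                and err.headers.get("Location") == expected_new_url)
    return False  # the old URL answered directly; no redirect at all
```

For example, `check_permanent_redirect("http://example.com/old-permalink", "http://example.com/new-permalink")` returns True only when the 301 is actually in place.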

5) Extreme SEO
Link: http://www.seolinknet.com/

This plugin has several functions that are all run without any effort on your part.

1. Provides your site with one way, content related, dofollow inbound links from other sites in the extremeseo network.
2. Provides links to related posts under your post content that open in a new browser window, so you don’t lose your visitors when they click the related post links.
3. Increases your site’s search engine ranking and Google Page Rank by providing static inbound links to your posts from related posts on other sites within the extremeseo network.

Top 10 tips for high link popularity

Link popularity has become an important factor in the ranking algorithms of most search engines. If many quality web sites link to your site, you’ll have a good position in the search engines.

10. Cross-link your sites

An easy way to improve your link popularity is to add a link from your own popular web site to a less popular site of your own.

However, it’s very important that one site is not simply a duplicate of the other, and that both sites are related to each other.

9. Give testimonials
Everyone has some favorite software tools and utilities. Contact the publishers or developers and explain why you like their software. Some of them will ask for permission to display your testimonial on their web site, along with a link to your site.

You can also contact web sites that offer interesting articles rather than software. Just make sure that you really stand behind what you say, as it can backfire on you if you recommend bad products.
8. Awards for you and other sites
You can submit your web site to sites that offer awards. If they grant you their award, they’ll probably link to your site. Just search Google for “awards directory your-keyword” (replace “your-keyword” with a keyword that is important to your business).

You can also offer an award to other web sites. Submit your awards page to the above-mentioned awards directories to make it popular. The web sites that receive your award will link back to your site.
7. Participate in newsgroups and discussion forums
When you post to newsgroups and discussion forums, add a signature file with a link to your web site.

An easy way to post messages in newsgroups is to create a Google account and to post via their web interface:
http://groups.google.com

To find good discussion forums, just search Google for “forums directory your-keyword”. For example, if you want to participate in a marketing forum, search Google for “forums directory marketing”.
6. Provide a link directory
Offer a link directory on your web site which links to quality web sites with a similar topic. Your visitors will appreciate such a resource and you’ll provide incentive for other sites to link to your site.

You can add a “link to us” page to your web site on which you promise to add other complementary sites to your link directory if they link back to you first.

5. List your web site in the Yahoo directory
If you have a business web site, you might want to pay US$299 yearly to Yahoo to have your web site listed in the Yahoo directory. The Yahoo directory is independent from Yahoo’s main search engine:
http://docs.yahoo.com/info/suggest/busexpress.html

If you have a non-business site, you can submit to Yahoo without paying: http://docs.yahoo.com/info/suggest/

4. List your web site in the Open Directory Project (ODP) directory
If your web site is listed in the popular ODP directory, your web site will have a good base for link popularity.

Rumor has it that Google starts its crawls with the web sites listed in the ODP directory. Best of all, you can add your web site to ODP without paying a fee.

3. List your web site in regional and industry-specific directories.
A link in regional and industry specific directories is good for your web site because your link will appear on a related page. Links from related pages are much better than links from unrelated pages.
2. Write articles for your audience
Write articles about the topics that your web site visitors are interested in. After a while, other web sites will link back to your articles.

You can also send your articles to syndicate web sites. They’ll offer your articles to other web sites. Just make sure that your article contains a link to your web site so that other sites who publish your article will automatically link to you.
1. Get reciprocal links from complementary sites
Search the major search engines for your keywords. They will return many web sites that don’t compete with your site, or which offer complementary things.

Visit their site and then write them an email message to request a link to your site. If you link back to them first, they’ll be more likely to link to you.

Top 10 Joomla SEO tips for Google

How to search engine optimize your Joomla website in 10 easy steps.

It’s been a while since I have done much blogging about Joomla SEO, almost a year in fact. I thought I would do some research and see how the collective wisdom of the industry has changed regarding what matters in getting a page search engine optimized (SEO). This isn’t exhaustive; for that, you should check out my Complete guide to Joomla SEO or Alledia’s SEO guide. Use this top-ten list if you already have a site ranked and want to see what you can implement to push your ranking higher. Much of the information here is based on two 2007 studies about ranking in Google, from SEOmoz.org and Sistrix.

1. Keyword Use in Title Tag

The number one factor in ranking a page on search engines is the title tag. These are the words in the <title> tag in the source of a page, and they appear in the blue bar of your browser.

Choose the title of an article very carefully. Joomla will use the title of the article in the title tag (what appears in the blue bar). It will also be the text used in any in-site links (see #5 and #6).
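To see exactly what a search engine reads, you can pull the title tag straight out of a page's source. A minimal Python sketch (`extract_title` is a name made up for this illustration):

```python
import re

def extract_title(html):
    """Pull the contents of the <title> tag out of a page's source:
    the number-one on-page ranking factor discussed above."""
    match = re.search(r"<title[^>]*>(.*?)</title>", html,
                      re.IGNORECASE | re.DOTALL)
    return match.group(1).strip() if match else None

page = "<html><head><title>Joomla SEO in 10 Easy Steps</title></head></html>"
print(extract_title(page))  # Joomla SEO in 10 Easy Steps
```

Running this against your own pages is a quick way to confirm Joomla is emitting the titles you chose.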

Some Helpful Tools to Find Broken Links in Your Website

When you are working on many pages, say 100 to 1,000, it is easy to slip up and point a link at the wrong URL, or at none at all, and it turns into a broken link (nothing but a 404 error). Tracking down these broken links across a hundred pages becomes a nightmare for the webmaster. To get rid of this type of headache, there are online web tools as well as desktop tools that tell you where broken links exist in a page, or across a number of pages, so you can easily find and fix these errors.

Listed below are the tools most widely used by webmasters. I recommend them because they make finding these errors much easier:

1. W3C Link Checker (Web Tool)

This is a web tool from the W3C. Link Checker looks for issues in links, anchors, and referenced objects in a web page, or recursively on a whole web site. Link Checker is part of the W3C’s validators and Quality Web tools – W3C Link Checker

2. Web Link Validator (Desktop Tool)

Web Link Validator checks site integrity and syntax and produces automatic reports. It runs on any system with Windows 98/Me/NT/2000/XP/2003/Vista – Web Link Validator

3. Xenu’s Link Sleuth (Desktop Tool)

It checks websites for broken links; verification is also done on “normal” links, images, frames, plug-ins, backgrounds, local image maps, style sheets, scripts, and Java applets. It has many more features besides.

4. Dead-Links.com – Free Broken Link Checker (Web Tool)

Put your website or weblog home page URL in the input box, and a little spider will read the HTML code and check for broken links – Dead-Links.com – Free Broken Link Checker

5. LinkChecker 0.6.3 by Kevin Freitas (Browser Add-on)

LinkChecker 0.6.3 is a very good Firefox add-on that checks the validity of the links on any web page – LinkChecker 0.6.3

6. LinkTiger (Web Tool)

LinkTiger has very good features: there is no software installation; just register and find out whether your site has any broken links. It even scans CSS, PDF, Flash, and MS Office files – LinkTiger

7. Link Checker Pro (Desktop Tool)

Link Checker Pro is the leading solution for website analysis and the detection of broken and other problem links. Link Checker Pro combines powerful features and an easy to use interface and is robust enough to deal with corporate websites containing 100,000 links or more.

Link Checker Pro

I hope the tools listed above help you find broken links on your site.
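If you'd rather script it yourself, the core of all these tools fits in a few lines: fetch a page, collect its links, and flag the ones that fail. A minimal Python sketch under obvious simplifications (absolute `http(s)` hrefs only, no recursion, no images or stylesheets; `find_broken_links` is a made-up name):

```python
import re
import urllib.request
import urllib.error

def find_broken_links(page_url):
    """Minimal broken-link checker in the spirit of the tools above:
    fetch one page, extract absolute href targets, and report any
    that return 404 or fail to connect. Real tools also handle
    relative URLs, images, CSS, recursion, and more."""
    html = urllib.request.urlopen(page_url).read().decode("utf-8", "replace")
    links = re.findall(r'href="(https?://[^"]+)"', html)
    broken = []
    for link in set(links):
        try:
            urllib.request.urlopen(link, timeout=10)
        except urllib.error.HTTPError as err:
            if err.code == 404:
                broken.append(link)
        except urllib.error.URLError:
            broken.append(link)  # DNS failure, refused connection, etc.
    return broken
```

It is slow and naive compared with Xenu or LinkTiger, but it shows there is no magic involved.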

HTTP status codes

When a request is made to your server for a page on your site (for instance, when a user accesses your page in a browser or when Googlebot crawls the page), your server returns an HTTP status code in response to the request.

This status code provides information about the outcome of the request; in particular, it gives Googlebot information about your site and the requested page.

Some common status codes are:

  • 200 – the server successfully returned the page
  • 404 – the requested page doesn’t exist
  • 503 – the server is temporarily unavailable

A complete list of HTTP status codes is below. Click the link for more information. You can also visit the W3C page on HTTP status codes for more information.
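Before going through the list, you can observe these codes yourself. The sketch below returns whatever status code a server sends for a URL (`status_of` is a hypothetical helper, not a standard function):

```python
import urllib.request
import urllib.error

def status_of(url):
    """Return the HTTP status code the server sends for a URL:
    the same code Googlebot sees when it crawls the page."""
    try:
        return urllib.request.urlopen(url).getcode()
    except urllib.error.HTTPError as err:
        return err.code  # 4xx/5xx responses arrive as exceptions here

# e.g. status_of("http://example.com/") would typically be 200,
# while a non-existent page on the same host would be 404.
```

Running it against your own pages is a quick sanity check before consulting server logs or Webmaster Tools.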

1xx (Provisional response)
Status codes that indicate a provisional response and require the requestor to take action to continue.

Code Description
100 (Continue) The requestor should continue with the request. The server returns this code to indicate that it has received the first part of a request and is waiting for the rest.
101 (Switching protocols) The requestor has asked the server to switch protocols and the server is acknowledging that it will do so.

2xx (Successful)

Status codes that indicate that the server successfully processed the request.

Code Description
200 (Successful) The server successfully processed the request. Generally, this means that the server provided the requested page. If you see this status for your robots.txt file, it means that Googlebot retrieved it successfully.
201 (Created) The request was successful and the server created a new resource.
202 (Accepted) The server has accepted the request, but hasn’t yet processed it.
203 (Non-authoritative information) The server successfully processed the request, but is returning information that may be from another source.
204 (No content) The server successfully processed the request, but isn’t returning any content.
205 (Reset content) The server successfully processed the request, but isn’t returning any content. Unlike a 204 response, this response requires that the requestor reset the document view (for instance, clear a form for new input).
206 (Partial content) The server successfully processed a partial GET request.

3xx (Redirected)
Further action is needed to fulfill the request. Often, these status codes are used for redirection. Google recommends that you use fewer than five redirects for each request. You can use Webmaster Tools to see if Googlebot is having trouble crawling your redirected pages. The Crawl errors page under Diagnostics lists URLs that Googlebot was unable to crawl due to redirect errors.

Code Description
300 (Multiple choices) The server has several actions available based on the request. The server may choose an action based on the requestor (user agent) or the server may present a list so the requestor can choose an action.
301 (Moved permanently) The requested page has been permanently moved to a new location. When the server returns this response (as a response to a GET or HEAD request), it automatically forwards the requestor to the new location. You should use this code to let Googlebot know that a page or site has permanently moved to a new location.
302 (Moved temporarily) The server is currently responding to the request with a page from a different location, but the requestor should continue to use the original location for future requests. This code is similar to a 301 in that for a GET or HEAD request, it automatically forwards the requestor to a different location, but you shouldn’t use it to tell the Googlebot that a page or site has moved because Googlebot will continue to crawl and index the original location.
303 (See other location) The server returns this code when the requestor should make a separate GET request to a different location to retrieve the response. For all requests other than a HEAD request, the server automatically forwards to the other location.
304 (Not modified) The requested page hasn’t been modified since the last request. When the server returns this response, it doesn’t return the contents of the page.

You should configure your server to return this response when a page hasn’t changed since the time given in the request’s If-Modified-Since HTTP header. This saves you bandwidth and overhead, because your server can tell Googlebot that a page hasn’t changed since the last time it was crawled.

305 (Use proxy) The requestor can only access the requested page using a proxy. When the server returns this response, it also indicates the proxy that the requestor should use.
307 (Temporary redirect) The server is currently responding to the request with a page from a different location, but the requestor should continue to use the original location for future requests. This code is similar to a 301 in that for a GET or HEAD request, it automatically forwards the requestor to a different location, but you shouldn’t use it to tell the Googlebot that a page or site has moved because Googlebot will continue to crawl and index the original location.
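Since Google recommends fewer than five redirects per request, it helps to see every hop a URL takes. One way to do that with Python's standard library is a custom redirect handler (`RedirectLogger` is my own name for this sketch):

```python
import urllib.request

class RedirectLogger(urllib.request.HTTPRedirectHandler):
    """Record every 3xx hop taken while fetching a URL, so you can
    count redirects per request against Google's fewer-than-five
    recommendation."""
    def __init__(self):
        self.hops = []

    def redirect_request(self, req, fp, code, msg, headers, newurl):
        self.hops.append((code, newurl))  # e.g. (301, "http://.../b")
        return super().redirect_request(req, fp, code, msg,
                                        headers, newurl)

logger = RedirectLogger()
opener = urllib.request.build_opener(logger)
# opener.open("http://example.com/some-page") would then leave the
# (status code, destination) of each redirect hop in logger.hops.
```

A chain like 301 then 302 shows up as two entries, which makes long or looping chains obvious.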

4xx (Request error)
These status codes indicate that there was likely an error in the request which prevented the server from being able to process it.

Code Description
400 (Bad request) The server didn’t understand the syntax of the request.
401 (Not authorized) The request requires authentication. The server might return this response for a page behind a login.
403 (Forbidden) The server is refusing the request. If you see that Googlebot received this status code when trying to crawl valid pages of your site (you can see this on the Web crawl page under Diagnostics in Google Webmaster Tools), it’s possible that your server or host is blocking Googlebot’s access.
404 (Not found) The server can’t find the requested page. For instance, the server often returns this code if the request is for a page that doesn’t exist on the server.

If you don’t have a robots.txt file on your site and see this status on the robots.txt page of the Diagnostic tab in Google Webmaster Tools, this is the correct status. However, if you do have a robots.txt file and you see this status, then your robots.txt file may be named incorrectly or in the wrong location. (It should be at the top-level of the domain and named robots.txt.)

If you see this status for URLs that Googlebot tried to crawl (on the HTTP errors page of the Diagnostic tab), then Googlebot likely followed an invalid link from another page (either an old link or a mistyped one).

405 (Method not allowed) The method specified in the request is not allowed.
406 (Not acceptable) The requested page can’t respond with the content characteristics requested.
407 (Proxy authentication required) This status code is similar to 401 (Not authorized), but specifies that the requestor has to authenticate using a proxy. When the server returns this response, it also indicates the proxy that the requestor should use.
408 (Request timeout) The server timed out waiting for the request.
409 (Conflict) The server encountered a conflict fulfilling the request. The server must include information about the conflict in the response. The server might return this code in response to a PUT request that conflicts with an earlier request, along with a list of differences between the requests.
410 (Gone) The server returns this response when the requested resource has been permanently removed. It is similar to a 404 (Not found) code, but is sometimes used in the place of a 404 for resources that used to exist but no longer do. If the resource has permanently moved, you should use a 301 to specify the resource’s new location.
411 (Length required) The server won’t accept the request without a valid Content-Length header field.
412 (Precondition failed) The server doesn’t meet one of the preconditions that the requestor put on the request.
413 (Request entity too large) The server can’t process the request because it is too large for the server to handle.
414 (Requested URI is too long) The requested URI (typically, a URL) is too long for the server to process.
415 (Unsupported media type) The request is in a format not supported by the requested page.
416 (Requested range not satisfiable) The server returns this status code if the request is for a range not available for the page.
417 (Expectation failed) The server can’t meet the requirements of the Expect request-header field.

5xx (Server error)
These status codes indicate that the server had an internal error when trying to process the request. These errors tend to be with the server itself, not with the request.

Code Description
500 (Internal server error) The server encountered an error and can’t fulfill the request.
501 (Not implemented) The server doesn’t have the functionality to fulfill the request. For instance, the server might return this code when it doesn’t recognize the request method.
502 (Bad gateway) The server was acting as a gateway or proxy and received an invalid response from the upstream server.
503 (Service unavailable) The server is currently unavailable (because it is overloaded or down for maintenance). Generally, this is a temporary state.
504 (Gateway timeout) The server was acting as a gateway or proxy and didn’t receive a timely request from the upstream server.
505 (HTTP version not supported) The server doesn’t support the HTTP protocol version used in the request.
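Because 503 is explicitly a temporary state, a well-behaved client waits and retries instead of giving up, honouring the Retry-After header when the server provides one. A Python sketch (`fetch_with_retry` is a hypothetical helper, and it assumes Retry-After carries seconds rather than an HTTP date):

```python
import time
import urllib.request
import urllib.error

def fetch_with_retry(url, attempts=3):
    """Fetch a URL, treating 503 as the temporary condition the
    table describes: wait (honouring Retry-After when present)
    and try again a few times before giving up."""
    for attempt in range(attempts):
        try:
            return urllib.request.urlopen(url).read()
        except urllib.error.HTTPError as err:
            if err.code != 503 or attempt == attempts - 1:
                raise  # a different error, or out of retries
            # Assumes Retry-After is a number of seconds, not a date.
            delay = int(err.headers.get("Retry-After", "1"))
            time.sleep(delay)
```

Googlebot behaves in a similar spirit: it backs off and recrawls later rather than treating a 503 as a dead page.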

How to Optimise Flash Sites: A Brief Note

If there is one really hot potato that divides SEO experts and Web designers, it is Flash. Undoubtedly a great technology for including sound and pictures on a Web site, Flash movies are a real nightmare for SEO experts. The reason is pretty prosaic – search engines cannot index (or at least not easily) the contents inside a Flash file, and unless you feed them the text inside a Flash movie, you can simply count this text as lost for boosting your rankings. Of course, there are workarounds, but until search engines start indexing Flash movies as if they were plain text, these workarounds are just a clumsy way to optimize Flash sites, although they are certainly better than nothing.

Why Do Search Engines Dislike Flash Sites?

Search engines dislike Flash Web sites not because of their artistic qualities (or the lack of these) but because Flash movies are too complex for a spider to understand. Spiders cannot index a Flash movie directly, as they do with a plain page of text. Spiders index filenames (and you can find tons of these on the Web), but not the contents inside.

Flash movies come in a proprietary binary format (.swf), and spiders cannot read the insides of a Flash file, at least not without assistance. And even with assistance, do not count on spiders crawling and indexing all your Flash content. This is true for all search engines. There might be differences in how search engines weigh page relevancy, but in their approach to Flash, at least for the time being, search engines are really united – they hate it, but they index portions of it.
