Tuesday, August 16, 2011

5 Website Optimization Mistakes That Kill Your Rankings

Working with many websites over my career as a professional SEO, I have seen a lot of problems on sites that did not rank well in the search engines. The following issues are the five biggest mistakes I've seen time and again.

Blocking search engine robots
The robots.txt file contains instructions that tell search engine robots which pages and directories of a site they may and may not visit. Crawlers look for this file the first time they encounter a site. Using the Robots Exclusion Protocol in robots.txt, you can stop robots from visiting certain pages or directories on your website. The mistake happens when a webmaster inadvertently blocks crawling of the root folder and removes the entire site from the index. If your robots.txt file contains a line that looks like Disallow: / then you are blocking the entire site.
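
As a rough illustration (the directory name is made up for the example), here is the difference between a robots.txt that blocks the entire site and one that blocks only a single folder:

    # BAD: blocks every robot from the entire site
    User-agent: *
    Disallow: /

    # OK: blocks only the /private/ directory and leaves the rest crawlable
    User-agent: *
    Disallow: /private/

Note that Disallow: / with a bare slash matches every URL on the site, while Disallow: with an empty value allows everything.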

JavaScript Navigation

Many sites use JavaScript to create drop-down, accordion, and other navigation systems. This type of navigation can make it easier for visitors to browse large websites. To search engine crawlers, however, it can look quite different. The problem with JavaScript is that a menu can be fully functional for visitors while containing no links in the page's source code. Search engine robots rely on links in the page's code to navigate. Disable JavaScript in your web browser and look at your website: if you cannot see the site's navigation, the robots cannot see it either.
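
To see the difference, here is a sketch (the URL is a placeholder) of a menu item robots cannot follow next to one they can:

    <!-- Invisible to robots: no href in the source, only a script action -->
    <div onclick="window.location='/products/'">Products</div>

    <!-- Visible to robots: a plain anchor tag with a real href -->
    <a href="/products/">Products</a>

A common compromise is to build the menu from plain anchor tags and let JavaScript add the drop-down behavior on top, so the links stay in the source code for the robots.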

Flash

Flash can make a website flashy and simple to use. Whole pages are created as Flash movies, and images, animations, and other elements can be included to improve the user experience. The downside is that this kind of experience is invisible to search engines. Because all the elements of a website built in Flash are embedded inside the movie, they cannot be seen by search engines. Content and links inside Flash are not read by search engine robots. If your site is built in Flash, keep some of your content and links outside the Flash movie. Add an HTML sitemap to your website to help robots crawl the whole site and know what each page contains.
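
One way to keep crawlable text on a Flash page is to put alternative HTML content inside the object tag that embeds the movie; browsers with Flash show the movie, while robots and visitors without the plugin see the HTML. A minimal sketch, with the file name and text invented for the example:

    <object type="application/x-shockwave-flash" data="intro.swf"
            width="600" height="400">
      <param name="movie" value="intro.swf" />
      <!-- Fallback content that search engine robots can read -->
      <h1>Acme Widgets</h1>
      <p>Handmade widgets, shipped worldwide.</p>
      <a href="/products/">Browse the product catalog</a>
    </object>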

301 redirects and canonical tags

301 redirects are used to show that the content of a page has permanently moved to a new URL. Canonical tags are used to indicate which page, out of a series of similar pages, should be listed in the search results. Canonical tags are often used where a 301 redirect cannot be. When used properly, canonical tags keep duplicate versions of a page out of the index. Before using canonical tags, assess whether you really need them. Does your site have several dynamically generated category pages built by the same script, all of which look alike and differ only in the products shown? Are the same pages reachable at many different URLs? Both situations are ideal candidates for canonical tags.
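
As a sketch of both techniques (the domain and paths are placeholders): a 301 redirect in an Apache .htaccess file, and a canonical tag in the head of a duplicate page pointing at the preferred URL:

    # .htaccess: the old URL permanently redirects to the new one
    Redirect 301 /old-page.html http://www.example.com/new-page.html

    <!-- In the <head> of each duplicate or near-duplicate page -->
    <link rel="canonical" href="http://www.example.com/widgets/" />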

Duplicate content or no content

Quality content is the key to good rankings. Google has recognized the importance of varied, high-quality content on a website, and its focus on content became even clearer with the Farmer/Panda update. Signals drawn from a page's content help determine where it ranks in the search results. Since Google wants to create the best user experience, it tries to show pages with quality content. Duplicate content was one of the main targets the Farmer/Panda update was written for; pages with content-quality problems were pushed down the rankings when the update went live last month. Look at the content of your website. Is it thin? Is it original, or copied from another site? Develop your own content for the site, and be wary of pages whose content is copied from another source.
