The popularity of the Internet as a medium of mass communication and marketing has invited malpractices aimed at artificially boosting a website's visibility. One such practice is known as spamdexing, where webmasters create web pages solely to inflate search engine rankings, with no regard for the content or relevance of the site. Spamdexing is considered an unethical method of promoting a website and increasing its traffic.

Several techniques fall under the umbrella of spamdexing. One of the most common is stuffing pages with keywords. Although keyword use is a tried-and-true, legitimate method of on-page optimization, spamdexed pages contain no useful information related to the keywords, and most of them fail to make any sense at all.
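One signal that makes stuffed pages easy to spot is an abnormally high keyword density. The sketch below is purely illustrative (the tokenizer and the example strings are my own assumptions, not any search engine's actual method):

```python
import re

def keyword_density(text: str, keyword: str) -> float:
    """Fraction of words in `text` that match `keyword` (case-insensitive)."""
    words = re.findall(r"[a-z']+", text.lower())
    if not words:
        return 0.0
    return words.count(keyword.lower()) / len(words)

# A normal sentence mentions its topic once or twice...
normal = "Our bakery sells fresh bread and pastries every morning."
# ...while a stuffed page repeats the target keyword relentlessly.
stuffed = "cheap flights cheap flights book cheap flights now cheap flights"

print(round(keyword_density(normal, "bread"), 2))   # → 0.11
print(round(keyword_density(stuffed, "cheap"), 2))  # → 0.4
```

A page where a single keyword accounts for a large share of all words reads as gibberish to a human, which is exactly what the article describes.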

Spamdexing can also be accomplished by stuffing keywords into meta tags. This works like keyword stuffing, except that the keywords are hidden in the page's meta tags rather than in its visible text. Another increasingly popular way to spamdex is article spinning, in which webmasters rewrite existing articles (as opposed to merely scraping content from other sites) to avoid the penalties search engines impose for duplicate content. Spinning is done either with automated tools that draw on a thesaurus database or by content writers who produce multiple versions of the same piece.
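The thesaurus-based spinning described above can be sketched in a few lines. This is a minimal toy, assuming a tiny hand-made synonym table; real spinning tools rely on large thesaurus databases and more careful grammar handling:

```python
import random

# Hypothetical miniature thesaurus; real tools use far larger databases.
SYNONYMS = {
    "quick": ["fast", "speedy"],
    "guide": ["tutorial", "handbook"],
    "improve": ["boost", "enhance"],
}

def spin(text: str, rng: random.Random) -> str:
    """Replace each known word with a randomly chosen synonym."""
    out = []
    for word in text.split():
        subs = SYNONYMS.get(word.lower())
        out.append(rng.choice(subs) if subs else word)
    return " ".join(out)

rng = random.Random(0)
print(spin("a quick guide to improve your rankings", rng))
```

Each run yields a slightly different "unique" version of the same sentence, which is precisely what lets spun content slip past naive duplicate-content checks.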

Some webmasters also perform what is known as ‘page hijacking’: they create rogue copies of popular websites, using the original content to fool search engine crawlers into thinking their site is something it is not, purely to capture traffic. Searchers are then redirected to unrelated or malicious websites.

Creating fake blogs for the sole purpose of spamming – known as spam blogs, or splogs, and often generated automatically – is also considered a form of spamdexing. Tens of thousands of spamdexed web pages are created every day.

Major search engines have taken measures to counter the menace of spamdexing, with Google a forerunner in this respect. It has developed "intelligent" spiders to ensure that websites cannot fool them through keyword stuffing and meta tag stuffing. To curb link-based spamming techniques, it has added further countermeasures, such as tracking the number of incoming and outgoing links on a page.
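The link-tracking idea can be illustrated with a crawler-style pass that counts a page's outbound links. This is only a sketch of one such signal using Python's standard-library HTML parser; actual search-engine heuristics are not public:

```python
from html.parser import HTMLParser

class LinkCounter(HTMLParser):
    """Collect the href of every anchor tag encountered in a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(href)

# Hypothetical page fragment with two outbound links.
page = '<p><a href="https://a.example">one</a> and <a href="https://b.example">two</a></p>'
counter = LinkCounter()
counter.feed(page)
print(len(counter.links))  # → 2
```

A page carrying hundreds of outbound links with almost no surrounding text would score very differently from a normal article, which is the kind of imbalance link-based countermeasures look for.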

It is important to know what is legitimate and what is considered unethical in website development, so that ignorance does not land you in troubled territory. The best way to avoid spamdexing is simply not to overdo it when optimizing your website.

Summary: Spamdexing is an unethical way of optimizing a website, aimed at increasing its popularity. It can be accomplished through keyword and meta tag stuffing, page hijacking, and link spamming.

Content Copyright 2010 - 2015 --- All rights reserved.