CLOAKING
Cloaking is the practice of showing one version of a webpage to a search crawler such as Bingbot, and another to normal visitors. Serving crawlers different content than users can be treated as a spam tactic; it can hurt your website's rankings and even lead to your site being delisted from our index. We therefore recommend being extremely cautious about responding differently to crawlers than to "regular" visitors, and, as a principle, not cloaking at all.
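One way to guard against accidental cloaking is to verify that your server returns the same content regardless of the requesting user agent. A minimal sketch using Python's standard library (the user-agent strings and comparison-by-hash approach are illustrative, not a Bing-prescribed check):

```python
import hashlib
import urllib.request

# Illustrative user-agent strings; real crawler UAs are documented by Bing.
BINGBOT_UA = ("Mozilla/5.0 (compatible; bingbot/2.0; "
              "+http://www.bing.com/bingbot.htm)")
BROWSER_UA = "Mozilla/5.0 (Windows NT 10.0; Win64; x64)"

def fetch_body(url, user_agent):
    """Fetch a page body while presenting the given User-Agent header."""
    req = urllib.request.Request(url, headers={"User-Agent": user_agent})
    with urllib.request.urlopen(req) as resp:
        return resp.read()

def same_content(body_a, body_b):
    """Compare two response bodies by hash to spot crawler-specific content."""
    return hashlib.sha256(body_a).hexdigest() == hashlib.sha256(body_b).hexdigest()
```

Note that legitimately dynamic content (timestamps, session tokens) can make a byte-for-byte comparison noisy; comparing only the primary content block of the page is more robust in practice.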
LINK SCHEMES, LINK BUYING, LINK SPAMMING
While link schemes may succeed in increasing the number of links pointing to your website, they will fail to bring quality links to your site, netting no positive gains. In fact, manipulating inbound links to artificially inflate the number of links pointed at a website can even lead to your site being delisted from our index.
SOCIAL MEDIA SCHEMES
Like farms are similar to link farms in that they seek to artificially exploit a network effect to game the algorithm. In reality, these are easy to see in action and their value is discounted. Auto-follow tools encourage follower growth on social sites such as Twitter by automatically following anyone who follows you. Over time this creates a scenario where the number of accounts you follow is more or less the same as the number of followers you have, which does not indicate you have a strong influence. Following relatively few people while having a high follower count tends to indicate a stronger influential voice.
META REFRESH REDIRECTS
These redirects reside in the code of a webpage and are programmed with a preset time interval. When the time expires, they automatically send the visitor to other content. Rather than using a meta refresh redirect, we suggest you use a standard 301 redirect.
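The difference can be sketched in a few lines. The meta refresh lives in the page markup and fires client-side after a delay, while a 301 is a server response that tells both browsers and crawlers the move is permanent (the URL below is a placeholder):

```python
# Client-side meta refresh: waits 5 seconds, then navigates. Avoid this.
META_REFRESH = '<meta http-equiv="refresh" content="5; url=https://example.com/new-page">'

def permanent_redirect(new_url):
    """Build the status code and headers for a server-side HTTP 301 redirect."""
    return 301, {"Location": new_url}

# The server answers with a 301 status and a Location header; no page
# content or timer is involved, and crawlers transfer signals to the new URL.
status, headers = permanent_redirect("https://example.com/new-page")
```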
DUPLICATE CONTENT
Duplicating content across multiple URLs can lead to Bing losing trust in some of those URLs over time. Manage this issue by fixing the root cause of the duplication. The rel=canonical element can also be used, but it should be treated as a secondary solution to fixing the core problem. If excessive URL parameterization is causing duplicate content issues, we encourage you to use the Ignore URL Parameters tool.
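When parameterization is the root cause, normalizing URLs so duplicates collapse to one address illustrates the idea. A sketch using Python's standard library (the list of ignorable parameters is hypothetical, and the actual Ignore URL Parameters tool is configured in Bing Webmaster Tools, not in code):

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Hypothetical parameters that do not change the page's content.
IGNORED_PARAMS = {"sessionid", "utm_source", "utm_medium", "sort"}

def canonical_url(url):
    """Drop ignorable query parameters so duplicate URLs collapse to one."""
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query)
            if k not in IGNORED_PARAMS]
    return urlunsplit((parts.scheme, parts.netloc, parts.path,
                       urlencode(kept), ""))

def canonical_link_tag(url):
    """Emit the rel=canonical element pointing at the normalized URL."""
    return f'<link rel="canonical" href="{canonical_url(url)}">'
```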
KEYWORD STUFFING
When creating content, write for real users and readers, not to entice search engines to rank your content higher. Stuffing your content with specific keywords with the sole intent of artificially inflating the probability of ranking for specific search terms violates our guidelines and can lead to demotion or even the delisting of your website from our search results.
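Keyword stuffing typically shows up as abnormally high term frequency. A rough illustrative heuristic for auditing your own pages (the density computation and any threshold you apply are arbitrary examples, not a Bing metric):

```python
import re

def keyword_density(text, keyword):
    """Fraction of words in `text` that match `keyword`, case-insensitively."""
    words = re.findall(r"[a-z']+", text.lower())
    if not words:
        return 0.0
    return words.count(keyword.lower()) / len(words)

# A stuffed snippet: 4 of its 10 words are the same keyword.
sample = "cheap shoes cheap shoes buy cheap shoes online cheap shoes"
density = keyword_density(sample, "cheap")
```

A density this high for a single term is the kind of pattern that reads as written for search engines rather than for people.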