
How To Tell Google Bot To Skip Part Of Html?

There is plenty of information about the opposite situation, where people try to put content in the HTML that is visible to Google bots but not to users. In my case I need the opposite: part of the HTML should be visible to users but skipped by the Google bot.

Solution 1:

Maybe base64-encoding the content on the server side and then decoding it on the client side could work?

Code:

<!-- visible to Google -->
<p>Hi, Google Bot!</p>
<!-- not visible from here on -->
<script type="text/javascript">
  document.write(atob("<?php echo base64_encode('<b>hey there, user</b>'); ?>"));
</script>

How it looks to the bot:

<!-- visible to Google -->
<p>Hi, Google Bot!</p>
<!-- not visible from here on -->
<script type="text/javascript">
  document.write(atob("PGI+aGV5IHRoZXJlLCB1c2VyPC9iPg=="));
</script>
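
A minimal PHP sketch of the same idea (the helper name hide_from_bots() is my own, not from the answer): the fragment only ever appears base64-encoded in the page source, and the browser decodes it with atob(). Note that atob() decodes to Latin-1, so fragments containing multibyte UTF-8 characters need extra handling.

<?php
// Sketch only: wrap an HTML fragment in a <script> that decodes it client-side,
// so the raw markup never appears in the HTML source a bot fetches.
// hide_from_bots() is a made-up helper name for this example.
function hide_from_bots($html)
{
    $encoded = base64_encode($html);                    // encode on the server
    return '<script type="text/javascript">'
         . 'document.write(atob("' . $encoded . '"));'  // decode in the browser
         . '</script>';
}

echo '<p>Hi, Google Bot!</p>';                  // indexed normally
echo hide_from_bots('<b>hey there, user</b>');  // only appears after JS runs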

Solution 2:

Create a div and load its content via Ajax from an HTML file that lives in a directory blocked for robots, e.g. /hiddendirfrombots/test.html.

Somewhere in the header (see http://api.jquery.com/jQuery.ajax/):

$.ajax({
  url: '/hiddendirfrombots/test.html',
  success: function(data) {
    $('#hiddenfrombots').html(data);
  }
});

... somewhere in the body

<div id="hiddenfrombots"></div>

Create a directory "hiddendirfrombots" and put the following in the site root's robots.txt:

User-agent: *
Disallow: /hiddendirfrombots/

Solution 3:

This should do the trick:

<!--googleoff: index--><p>hide me!</p><!--googleon: index-->

For more information, check out the link to Google's page that describes it in more depth.

Excluding Unwanted Text from the Index
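
As an illustration only (the wrapper function below is my own, not from the answer), the same comments can be emitted around any server-generated fragment:

<?php
// Sketch only: surround a fragment with googleoff/googleon comments so the
// indexer skips it. exclude_from_index() is a made-up helper name.
function exclude_from_index($html)
{
    return '<!--googleoff: index-->' . $html . '<!--googleon: index-->';
}

echo '<p>indexed as usual</p>';
echo exclude_from_index('<p>hide me!</p>');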

Solution 4:

If you can use PHP, just output your content only when the visitor is not Googlebot:

// output the content only if the user agent is not Googlebot
if (!strstr(strtolower($_SERVER['HTTP_USER_AGENT']), "googlebot")) {
    echo $div;
}

That's how I solved this issue.
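
If you want to hide the block from more crawlers than just Googlebot, a slightly broader check along the same lines might look like this (a sketch only; the is_bot() helper and the list of bot substrings are assumptions, not part of the original answer):

<?php
// Sketch: return true if the user agent looks like a well-known crawler.
// The list of substrings is illustrative, not exhaustive.
function is_bot()
{
    $ua = strtolower($_SERVER['HTTP_USER_AGENT'] ?? '');
    foreach (array('googlebot', 'bingbot', 'yandexbot', 'duckduckbot') as $bot) {
        if (strstr($ua, $bot)) {
            return true;
        }
    }
    return false;
}

if (!is_bot()) {
    echo $div;   // only sent to visitors that don't identify as crawlers
}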

Solution 5:

  • Load your content via an Ajax call.
  • But put the call in a separate JS file (e.g. noGoogleBot.js) that contains the function implementing the Ajax call:

    $.ajax({
      url: 'anything.html',
      success: function(data) {
        $('#anywhere').html(data);
      }
    });
    

Then in your robots.txt

User-agent: *
Disallow: /noGoogleBot.js

So all the divs that are loaded using the function in noGoogleBot.js will be hidden from crawlers: Googlebot (or any other crawler that respects robots.txt) will ignore the content loaded by noGoogleBot.js.
