Wednesday, March 28, 2012

Why can't web robots request aspx pages in an Ajax project?

When I create aspx pages in an Ajax project, Googlebot, Yahoo Slurp and other crawlers can't see these pages.

I've created two test sites to show the problem:

Page without Ajax: http://test.cky.pl/noajax/Default.aspx

Page with Ajax: http://test.cky.pl/ajax/Default.aspx

In a browser I can see both of these pages. But I also tried to check what a web robot sees when it requests my pages. I was using this googlebot spoofer and other tools, and the result was the same: http://www.smart-it-consulting.com/internet/google/googlebot-spoofer/index.htm

For the Ajax page I was getting an error message: Invalid URL or server does not respond, HTTP return code: 500

For the normal aspx page everything is OK.
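For reference, here is a rough sketch (my own, not part of the spoofer tool) of how the same check can be reproduced from a small C# console program: it requests both test URLs with a Googlebot-style User-Agent and prints the HTTP status code that comes back.

using System;
using System.Net;

class CrawlerCheck
{
    static void Main()
    {
        string[] urls =
        {
            "http://test.cky.pl/noajax/Default.aspx",
            "http://test.cky.pl/ajax/Default.aspx"
        };

        foreach (string url in urls)
        {
            HttpWebRequest request = (HttpWebRequest)WebRequest.Create(url);
            // Pretend to be Googlebot; ASP.NET picks the browser
            // capabilities profile for the request from this header.
            request.UserAgent = "Googlebot/2.1 (+http://www.google.com/bot.html)";

            try
            {
                using (HttpWebResponse response = (HttpWebResponse)request.GetResponse())
                {
                    Console.WriteLine("{0} -> {1}", url, (int)response.StatusCode);
                }
            }
            catch (WebException ex)
            {
                // Non-2xx responses are raised as WebException; report the status if we have one.
                HttpWebResponse errorResponse = ex.Response as HttpWebResponse;
                Console.WriteLine("{0} -> {1}", url,
                    errorResponse != null ? ((int)errorResponse.StatusCode).ToString() : ex.Status.ToString());
            }
        }
    }
}

With a set-up like the one described above, the Ajax page is the one that comes back as 500 while the plain page returns 200.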

If someone could show me a solution for this problem and explain why robots can't request Ajax pages, I would be very thankful.

Have you found a solution for this yet? I also noticed the same problem. If you point the W3C link validator at an AJAX site, it just says HTTP 500 internal server error. You get the same problem if you use a link-checking package such as Xenu. If you view the page in your browser, it's fine.

This concerns me as I'm heavily involved in SEO - will Googlebot see the same and not index the site...


OK, found a workaround. Add this to Page.Init:

if (Request.Browser.Crawler == true
    || Request.Browser.W3CDomVersion.ToString() != "1.0"
    || Request.Browser.Type.ToString().Contains("Opera"))
{
    // Crawler or other down-level client: serve the page without partial rendering
    ScriptManager1.EnablePartialRendering = false;
}
else
{
    ScriptManager1.EnablePartialRendering = true;
}
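For completeness, a sketch of how that workaround can be wired into the page's code-behind (my own framing of the same snippet; the class name and "ScriptManager1" as the control ID are assumptions, matching whatever is declared in the .aspx markup):

using System;
using System.Web.UI;

public partial class _Default : Page
{
    // Assumes the markup declares a ScriptManager with ID="ScriptManager1".
    protected void Page_Init(object sender, EventArgs e)
    {
        // Crawlers and other down-level clients don't run the ASP.NET Ajax
        // client script, so fall back to full-page rendering for them.
        bool downLevelClient =
            Request.Browser.Crawler
            || Request.Browser.W3CDomVersion.ToString() != "1.0"
            || Request.Browser.Type.Contains("Opera");

        ScriptManager1.EnablePartialRendering = !downLevelClient;
    }
}

The 500 presumably comes from the ScriptManager tripping over the down-level browser capabilities ASP.NET assigns to crawler User-Agents, so turning partial rendering off for those clients sidesteps it and serves them plain, fully rendered HTML.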

Thanks a lot, it helps.
