GB_001 Posted July 16, 2008

Hello, I was wondering if there is a solution to the trouble search spiders have with full AJAX sites; Googlebot, for example, doesn't seem to handle AJAX sites very well. Is there a workaround for this? Thank you, -GB.
PC Nerd Posted July 16, 2008

Not that I know of. You might want to look at detecting on the server whether the visitor is Googlebot, and if so, send the whole page for the bot to catalogue; the next (normal) client to the site would then receive the AJAX version. Also, not all browsers support JavaScript, and some people have it turned off, as I'm sure you're aware, so make sure your site is viewable without JavaScript anyway, AJAX included.
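A rough sketch of the server-side check described above, assuming a PHP back end; the file names and the crawler list are only placeholders, not a complete solution:

<?php
// Sketch only: look at the user agent and serve a plain, fully rendered
// page to known crawlers, and the AJAX-driven version to everyone else.
$agent = isset($_SERVER['HTTP_USER_AGENT']) ? $_SERVER['HTTP_USER_AGENT'] : '';
$isCrawler = (stripos($agent, 'Googlebot') !== false
           || stripos($agent, 'Slurp') !== false
           || stripos($agent, 'msnbot') !== false);

if ($isCrawler) {
    // static HTML with all the content already in the markup (placeholder file)
    include 'content_static.php';
} else {
    // normal AJAX-driven version for regular visitors (placeholder file)
    include 'content_ajax.php';
}
?>

For this to be safe, the bot version should carry the same content as the AJAX version; serving bots something different is what gets flagged as cloaking.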
GB_001 Posted July 16, 2008 Author

Thanks. I don't know if I'd want to redo my site... but I have seen sites that were almost entirely AJAX and still ranked highly with Google, Facebook for example.
PC Nerd Posted July 17, 2008

Hmm, well, I would think about adding a <noscript> section, so that if a browser doesn't have JavaScript/AJAX support (including bots), it displays plain text and the page is still detailed for search engines. The other thing you might want to do for bots is use your robots.txt file to steer them to a single path where you serve information designed only for bots, although this can be an issue if the search engines decide your site is providing false information. Still, it's something you might want to look into. Good luck.
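As an illustration of the <noscript> idea (the markup, content, and products.php file name are only examples):

<div id="content">
    <!-- filled in by AJAX for visitors with JavaScript enabled -->
</div>
<noscript>
    <!-- plain HTML version of the same content, visible to bots and to
         visitors with JavaScript turned off -->
    <p>Welcome to the site. The full product listing is available at
       <a href="products.php">products.php</a> as ordinary pages.</p>
</noscript>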
mainewoods Posted July 18, 2008

Google does not index any content delivered with JavaScript, and that includes AJAX; the Googlebot does not execute any JavaScript, so it doesn't see it. To see what Googlebot sees, turn off all JavaScript in your browser and then visit your site. If any of your links are now dead because they are JavaScript-activated, then that is what Googlebot sees. I have seen sites where every link on the home page was dead with JavaScript turned off; Googlebot will see such a site as a one-page site! Using a lot of AJAX in your web pages makes SEO difficult. One trick you can do is always provide an actual real href= for links that normally load content through AJAX:

<a href="sitename.com/directory/filename.php" onclick="doajaxload();return false;">load content</a>

Googlebot will only see the href; it will not see the onclick. That filename.php would have to be a real page with the real content for the spiders to munch on.
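The doajaxload() handler itself isn't shown in the post above; purely as an illustration of the pattern, it might look something like this (the URL and the "content" element ID are made up):

<a href="sitename.com/directory/filename.php" onclick="doajaxload(this.href); return false;">load content</a>

<script type="text/javascript">
// Illustrative only: fetch the same URL the href points at and drop the
// response into the page. Bots that ignore JavaScript simply follow the href.
function doajaxload(url) {
    var xhr = new XMLHttpRequest(); // older IE would need ActiveXObject instead
    xhr.open('GET', url, true);
    xhr.onreadystatechange = function () {
        if (xhr.readyState === 4 && xhr.status === 200) {
            document.getElementById('content').innerHTML = xhr.responseText;
        }
    };
    xhr.send(null);
}
</script>

Because the real content lives at the href target, the page degrades gracefully for crawlers and for visitors with JavaScript turned off.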