f1r3fl3x Posted August 24, 2009

Hi, I'm really new at SEO, so I wonder whether my idea will be search-engine friendly or not. Here's the idea: I have a webpage that loads content dynamically via AJAX calls. The part I'm unsure about is this: I'm going to give the link an href like http://site.com/articles/article1.html plus an onClick event that loads the content into the div container (getcontent.php?article=article1). The AJAX response will have the same content as the href target, but without the design, just the text. This will make the user experience much better, but will it affect SEO?
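In code, the pattern described above might look something like this. This is only a sketch: the `loadArticle` helper, the exact URL scheme, and the `#content` id are illustrative assumptions, not the poster's actual code.

```javascript
// Hypothetical helper: map an article id to the two URLs the pattern needs.
function articleUrls(id) {
  return {
    href: "http://site.com/articles/article" + id + ".html", // full page, for bots and no-JS users
    ajax: "getcontent.php?article=article" + id              // content-only response, for the click handler
  };
}

// Click-handler sketch (browser-only): fetch the content-only fragment
// and swap it into the container div.
function loadArticle(id) {
  var xhr = new XMLHttpRequest();
  xhr.open("GET", articleUrls(id).ajax, true);
  xhr.onreadystatechange = function () {
    if (xhr.readyState === 4 && xhr.status === 200) {
      document.getElementById("content").innerHTML = xhr.responseText;
    }
  };
  xhr.send();
  return false; // stop the browser from also following the href
}
```

The `return false` is the key detail: without it the browser would follow the href after firing the handler, making the AJAX call pointless.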
Adam Posted August 27, 2009

To be honest it would probably be better for SEO, since there are no design features clogging up the markup. However, those are the pages the search engines will index, and of course that's where users will be taken: content-only pages.
JonnoTheDev Posted August 27, 2009

AJAX does not work for SEO! At all! Google cannot interpret JavaScript, nor will it look at JavaScript. If /articles/article1.html is spidered, that is the URL that will be indexed, not the page making an AJAX call to fetch content from that source. Search engines spider content that can be read from the HTML source; if the content is not visible in the HTML source, it is not visible to a spider. Take a look at this tool to see what your pages look like to a spider: http://www.smart-it-consulting.com/internet/google/googlebot-spoofer/ Or use the Firefox tool: http://www.seoforclients.com/blog/marketing/seo/how-to-browse-and-check-like-google-bot.html
Adam Posted August 27, 2009

I think he meant to provide an href link for search engines and JS-disabled users to follow, which would mean the content gets indexed, and for everyone else to fire the onclick event (but return false to stop the browser following the link).
JonnoTheDev Posted August 27, 2009

> I think he meant to provide an href link for search engines and JS-disabled users to follow, which would mean the content gets indexed

Would the page containing the content still have the navigation required to browse the site? If these pages get indexed, they are the pages users will be landing on. Using AJAX with the DOM to display content that matters for SEO is a really bad idea. Why not load the content onto the page from the start (not with AJAX, i.e. with an SQL query server-side) and just use the DOM to hide it from users until an event handler is clicked? The fact that the content is still viewable in the HTML source makes it SEO friendly.
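JonnoTheDev's alternative, sketched below: every article is rendered into the page server-side, so the text stays in the HTML source for spiders, and clicking a link just toggles which article is visible. The `article-<id>` div convention is an illustrative assumption.

```javascript
// Pure helper: decide the display style for each pre-rendered article div,
// e.g. <div id="article-108">...</div> <div id="article-1123" style="display: none">...</div>
function displayStyles(visibleId, allIds) {
  var styles = {};
  for (var i = 0; i < allIds.length; i++) {
    styles[allIds[i]] = (allIds[i] === visibleId) ? "block" : "none";
  }
  return styles;
}

// Browser-only wiring: apply the computed styles to the divs.
// No AJAX involved -- the content was in the markup all along.
function showArticle(id, allIds) {
  var styles = displayStyles(id, allIds);
  for (var key in styles) {
    document.getElementById("article-" + key).style.display = styles[key];
  }
  return false; // keep the browser from following the href
}
```

The trade-off is page weight: every article ships in the initial HTML, which is why this only suits pages with a modest amount of content.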
Adam Posted August 27, 2009

> Would the page containing the content still have the navigation required to browse the site? If these pages get indexed, they are the pages users will be landing on.

Yeah, that was my point here:

> However, those are the pages the search engines will index, and of course that's where users will be taken: content-only pages.
JonnoTheDev Posted August 27, 2009

Sorry, misread.
Adam Posted August 27, 2009

Heh, no worries. There is a solution though, f1r3fl3x: when you make the AJAX request, add a parameter to the URL, something like "ajax=1", which your code can then use to determine whether or not to apply the design. The search engines will then be taken to the design-intact version, and the AJAX request will only return the content.
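Adam's "ajax=1" idea, transposed into a JavaScript sketch for illustration (the real site would do this in PHP; `renderArticle` and the wrapper markup are hypothetical):

```javascript
// Return either the bare article fragment (for XHR calls that pass ajax=1)
// or the fragment wrapped in the full page design (for bots and direct visits).
// Both responses live at the same URL, so whatever gets indexed still works.
function renderArticle(articleHtml, params) {
  if (params.ajax === "1") {
    return articleHtml; // content only, for the AJAX caller
  }
  return "<html><body><div id=\"content\">" + articleHtml + "</div></body></html>";
}
```

Because the decision is driven by an explicit request parameter rather than by sniffing the user agent, bots and users who land on the plain URL always get the full page.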
JonnoTheDev Posted August 27, 2009

> When you make the AJAX request, add a parameter to the URL, something like "ajax=1", which your code can then use to determine whether or not to apply the design. The search engines will then be taken to the design-intact version, and the AJAX request will only return the content.

You are talking about basic cloaking. This would work by determining the user agent.
f1r3fl3x Posted August 27, 2009

Thanks for the replies, guys! But I think you didn't understand me. I can't describe my idea very well because I'm not a native English speaker, but code is universal, so here is what I tried to describe in my previous post.

article108.html (using mod_rewrite to make the URLs friendlier):

```html
<!-- design and stuff -->
<div id="content">
  This article is about bla bla bla bla ...
  There's an article that's related to this one. Here's the LINK.
</div>
<!-- design and stuff -->
```

And the link is this (returning false so the browser doesn't also follow the href):

```html
<a href="article1123.html" onClick="loadArticle(1123); return false;">LINK</a>
```

The bots, as you said, ignore JavaScript, so they will follow the link to article1123.html, which is a page of the site like any other, with design, navigation and the content of the article:

```html
<!-- design and stuff -->
<div id="content">
  This is the content of article 1123...
</div>
<!-- design and stuff -->
```

The users, on the other hand, fire the JS event when they click the link, and the next article is dynamically loaded into the content div.

So in theory this will work: the users quickly load the articles via AJAX + JSON calls, and the bots index the pages normally, without knowing that there's dynamic content on the page. I hope you understood my idea this time, and I'll be glad to discuss any issues with this method.
JonnoTheDev Posted August 27, 2009

Should be fine.
f1r3fl3x Posted August 27, 2009

Wow, thanks for the quick reply.
Brandon_R Posted September 20, 2009

Google can follow JavaScript links now. It is evolving, guys; keep up.
f1r3fl3x Posted September 20, 2009

> Google can follow JavaScript links now. It is evolving, guys; keep up.

OK, but if I give all the links an id and then attach the onClick event dynamically when the page loads, would Google follow that link? I think it won't, because Google can't read JavaScript. Or am I wrong?
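Attaching the handlers after load, instead of using inline onClick attributes, might look like this. A sketch only: the `article-<id>` id convention is an assumption, and `loadArticle` is the function from the earlier post.

```javascript
// Pure helper: pull the article id out of a link id like "article-1123".
function parseArticleId(elementId) {
  var match = /^article-(\d+)$/.exec(elementId);
  return match ? match[1] : null;
}

// Browser-only wiring (guarded so the sketch also parses outside a browser):
// after the page loads, give every matching link an onclick that AJAX-loads
// the article and returns false, leaving the plain href for bots and
// JS-disabled visitors.
if (typeof window !== "undefined") {
  window.onload = function () {
    var links = document.getElementsByTagName("a");
    for (var i = 0; i < links.length; i++) {
      var id = parseArticleId(links[i].id);
      if (id !== null) {
        links[i].onclick = (function (articleId) {
          return function () { return loadArticle(articleId); };
        })(id);
      }
    }
  };
}
```

The immediately-invoked function wrapper is needed in old-style JavaScript so each handler captures its own `articleId` rather than the loop's final value.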
Derleek Posted September 21, 2009

So... I take it pages built from SQL queries are readable by web crawlers? Is it possible to make AJAX SEO friendly without having a physical .html file where the articles reside? Take WordPress or Joomla, for example. I know WordPress does not store individual blog entries as files like this, and to my knowledge neither does Joomla. Joomla is (somewhat) SEO friendly if installed and configured properly, and I'm pretty sure blog posts are SEO friendly through WordPress. I realize this is probably difficult to achieve... but if SQL-driven pages are SEO friendly, it seems like having the directory (and the individual article files) is a bit unnecessary. Theoretically (again, assuming crawlers can read pages generated from SQL queries), one could create an articleIndex.php file which queries all of the articles from a MySQL table... right? Not sure though; I could be off base, in which case, how are Joomla/WordPress/whatever CMS SEO friendly?
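The articleIndex.php idea above, sketched in JavaScript for illustration (the real thing would be PHP + MySQL; the row shape and markup are assumptions). The crawler never sees the SQL query itself, only the HTML the query results are rendered into, which is why database-driven pages like WordPress posts are spiderable:

```javascript
// Build index-page markup from rows as they might come back from a query
// like "SELECT id, title FROM articles". A spider reading this page's HTML
// source sees plain links it can follow -- it neither knows nor cares that
// a database produced them.
function renderArticleIndex(rows) {
  var html = "<ul>\n";
  for (var i = 0; i < rows.length; i++) {
    html += '  <li><a href="article' + rows[i].id + '.html">' + rows[i].title + "</a></li>\n";
  }
  return html + "</ul>";
}
```

With mod_rewrite mapping articleNNN.html to the PHP script, no physical .html files need to exist at all.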