JonnoTheDev

Staff Alumni
  • Posts

    3,584
  • Joined

  • Last visited

  • Days Won

    3

Everything posted by JonnoTheDev

  1. HTTP is static. Once you have requested a URL, the output is sent to the browser. The output can only change once the page is requested again (i.e. the user clicks refresh). An option is to poll the server via an AJAX request, however I fail to see why a user would stay on the same web page for a period of 5 minutes.
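The polling option mentioned above needs a server-side script for the browser to hit. A minimal sketch of such an endpoint, assuming a hypothetical filename `status.php` and a made-up JSON payload (a real page would return whatever data needs refreshing):

```php
<?php
// status.php -- hypothetical endpoint the browser could poll via AJAX
// (e.g. with setInterval in JavaScript). Returns current data as JSON
// so the page can update without a full refresh.
header('Content-Type: application/json');
$payload = array('time' => date('H:i:s')); // stand-in for real changing data
print json_encode($payload);
```

The browser-side timer would request this URL every few seconds and swap the result into the page.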
  2. This is a simple REST service. Use curl to make the request to http://api.tinychat.com/ROOMNAME.xml?password=PASS
     http://uk2.php.net/manual/en/function.curl-exec.php
     The response is XML, so parse it using simplexml:
     $xml = simplexml_load_string($response);
     print_r($xml);
     http://uk2.php.net/manual/en/simplexml.examples.php
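Putting the two steps above together, a rough sketch (ROOMNAME and PASS are placeholders, and the sample XML below is invented purely to illustrate element access; check the real response with print_r first):

```php
<?php
// Step 1: make the REST request with curl; ROOMNAME/PASS are placeholders.
$ch = curl_init('http://api.tinychat.com/ROOMNAME.xml?password=PASS');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // return response as a string
curl_setopt($ch, CURLOPT_TIMEOUT, 5);
$response = curl_exec($ch);
curl_close($ch);

// Step 2: parse XML with simplexml. The element names in this sample
// are made up to show the access syntax only.
$sample = '<room><name>ROOMNAME</name><users>3</users></room>';
$xml = simplexml_load_string($sample);
print (string)$xml->users; // elements read like object properties
```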
  3. Please say you have a backup
  4. Interesting
  5. The screen would have to be captured at the point the page is requested for the session containing the captcha answer to be set and the graphic to be displayed. Multiple requests cannot be made as the captcha graphic will change. Once the captcha image is recognised the results must be submitted into the form with the session still active.
  6. For images http://www.mathworks.com/access/helpdesk/help/toolbox/images/index.html?/access/helpdesk/help/toolbox/images/index.html
  7. Yes, if you were to follow this target you would end up with a different image generated to the one on the initial form, and another session value would be set. That is why I mentioned capturing the screen when the form page is requested in the process in my initial post.
  8. For your perusal guys: http://network-security-research.blogspot.com/
  9. You obviously think I'm stupid. If it was as simple as using file_get_contents I wouldn't be posting. Captchas require sessions. How can file_get_contents set a session? Curl would be making the requests (that's if I did use PHP). I've not asked for you to build this for me, so the process I laid out is obvious to everyone. Don't be so condescending in your reply. If you have no valid contribution to the topic then don't post.
  10. As an experiment (honestly). I am looking at how difficult it is to break captchas by creating a recognition tool. There are a few considerations before getting onto character recognition from the image, such as obtaining the image from the screen (most captchas use a server-side script to generate the image rather than saving an image in a directory, which also sets the solution in a session). I'm thinking that the tasks required for the whole process may be out of the scope of PHP (or at least parts). Process:
      1. Request URL of form
      2. Capture screen as graphic
      3. Crop image to just the captcha graphic
      4. Character recognition process on graphic
      5. Return result
      6. Complete required form input fields
      7. Submit
      8. Obtain response
      Thoughts?
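The session requirement discussed in this thread (steps 1 and 6-7 above) can at least be sketched in PHP with curl and a cookie jar, so the session cookie set when the captcha is generated survives into the form submission. All URLs and field names below are hypothetical:

```php
<?php
// Keep cookies in a temp file so both requests share the same session.
$jar = tempnam(sys_get_temp_dir(), 'cookies');

// Step 1: request the form page; the session cookie set alongside the
// captcha is written into the jar.
$ch = curl_init('http://example.com/form.php');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_COOKIEJAR, $jar);   // write cookies here
curl_setopt($ch, CURLOPT_COOKIEFILE, $jar);  // and send them back
$page = curl_exec($ch);
curl_close($ch);

// Steps 6-7: submit the recognised captcha answer with the SAME cookies,
// so the server-side session containing the solution is still active.
$ch = curl_init('http://example.com/form.php');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_COOKIEFILE, $jar);
curl_setopt($ch, CURLOPT_POST, true);
curl_setopt($ch, CURLOPT_POSTFIELDS, array('captcha' => 'RESULT'));
$response = curl_exec($ch);
curl_close($ch);
```

The screen-capture and character-recognition steps (2-5) would sit between the two requests and are outside PHP's comfort zone, as noted.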
  11. Read my above post. The code is there for you to manipulate. However passing strings in this fashion through the URL is poor but hey it's your project.
  12. This is osCommerce, isn't it? The serialized session contains a cart object. The session handler will unserialize it. Why are you attempting to do this yourself?
  13. You should join tables to reduce the number of queries. You have 5 queries within the loop over the result of an initial query, i.e. if the first query returned 500 records then that script would be running 1 + (500 * 5) = 2,501 queries in total. Crazy.
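The arithmetic above, plus the shape of the replacement query. The table and column names in the SQL are invented for illustration, since the original schema isn't shown:

```php
<?php
// One outer query plus five queries per returned row:
$rows  = 500;
$total = 1 + ($rows * 5);
print $total; // 2501

// Instead, a single joined query fetches the related data in one pass.
// Table/column names here are hypothetical.
$sql = "SELECT o.id, c.name, p.title
        FROM orders o
        INNER JOIN customers c ON c.id = o.customer_id
        INNER JOIN products  p ON p.id = o.product_id";
```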
  14. Because you would need to explode the values in the URL as an array. If you use strings you have no delimiter. However, ids could be comma separated, i.e. mysite.com/api.php?category=1,2,75
      <?php
      $categories = explode(",", $_GET['category']);
      $validUrlCategories = array();
      foreach($categories as $category) {
        if(in_array($category, array(1,2,3,4,75))) {
          $validUrlCategories[] = $category;
        }
      }
      if(count($validUrlCategories)) {
        print "the url contains the following valid categories: ".implode(",", $validUrlCategories);
      } else {
        print "no valid categories in url";
      }
      ?>
  15. Should be fine.
  16. Category is in the GET array. It is not global!
      <?php
      if(in_array(urldecode($_GET['category']), array('sports', 'performance', 'organization', 'random', 'surrounding'))) {
        print "valid";
      } else {
        print "invalid";
      }
      ?>
      mysite.com/api.php?category=sports+performance would return 'sports performance'. This is not a category! You should not pass category names through the URL in this fashion, especially as strings. Use their database ids.
  17. You will not be able to use PHP's mail function. It cannot connect to an SMTP server that requires authentication, as this cannot be set within your ini file. Since it is a local PC you have no mail transfer agent (MTA). Download and use the Swiftmailer libraries http://swiftmailer.org/ since you can configure an SMTP server. Read the docs on usage.
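A rough sketch of the kind of usage meant above, following the Swiftmailer 4 class names; the include path, host, port, credentials, and addresses are all placeholders, so check the documentation at swiftmailer.org for the exact current API:

```php
<?php
// Load Swiftmailer; path is a placeholder for wherever the library lives.
if (!@include_once 'lib/swift_required.php') {
    exit("Swiftmailer not installed; download it from swiftmailer.org\n");
}

// Authenticated SMTP transport -- the part PHP's mail() cannot do.
$transport = Swift_SmtpTransport::newInstance('smtp.example.com', 587)
    ->setUsername('you@example.com')
    ->setPassword('secret');

$mailer  = Swift_Mailer::newInstance($transport);
$message = Swift_Message::newInstance('Test subject')
    ->setFrom(array('you@example.com' => 'You'))
    ->setTo(array('recipient@example.com'))
    ->setBody('Hello from a local PC via authenticated SMTP.');

$mailer->send($message);
```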
  18. Actually the URLs may look like another directory but could in fact be mod-rewritten URLs, or the site may be implementing the MVC pattern. If it's a simple site like you have stated then just use files in the document root, not subdirectories, i.e.
      /docs/
        index.php
        services.php
        contact.php
        about.php
      No need to make it complicated.
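For reference, the mod-rewrite case mentioned above can be as small as this .htaccess fragment. It assumes Apache with mod_rewrite enabled; the path and file names are illustrative only:

```apache
# /services/ looks like a directory but maps to a flat file in the root.
RewriteEngine On
RewriteRule ^services/?$ services.php [L]
RewriteRule ^about/?$    about.php    [L]
```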
  19. You are talking about basic cloaking. This would work by determining the user agent.
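Determining the user agent, as described above, is a one-liner in PHP. This only illustrates the mechanism; search engines penalise cloaking, and the "Googlebot" substring check is the simplest possible test, not a robust one:

```php
<?php
// Basic cloaking check: does the request claim to come from Googlebot?
function isGooglebot($userAgent)
{
    return stripos($userAgent, 'Googlebot') !== false;
}

$ua = isset($_SERVER['HTTP_USER_AGENT']) ? $_SERVER['HTTP_USER_AGENT'] : '';
if (isGooglebot($ua)) {
    print "spider content";
} else {
    print "visitor content";
}
```

Note that the user agent header is trivially spoofed, which is exactly why the spoofer tools linked elsewhere in this thread work.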
  20. I haven't the time at the moment to look over the script in its entirety, but I may have a look over the weekend. Bookmarked.
  21. Sorry, misread.
  22. Would the page containing the content still have the navigation that is required to browse the site? If these pages get indexed, they are the pages users will be landing on. Using AJAX to display content via the DOM that is important for SEO is a really bad idea. Why not load the content onto the page from the start (not with AJAX, i.e. an SQL query) and just use the DOM to hide it from users until an event handler is clicked? The fact that the content will still be viewable in the HTML source makes it SEO friendly.
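The hide-with-DOM approach above, sketched minimally; the element id, link text, and content variable are made up, and the content would really come from an SQL query rather than a hard-coded string:

```php
<?php
// Content is written into the HTML server-side, so spiders see it in the
// source; CSS hides it and a click handler reveals it for users.
$content = "Full article text pulled from the database."; // stand-in for an SQL result

$html = '<a href="#" onclick="document.getElementById(\'more\').style.display=\'block\';return false;">Read more</a>'
      . '<div id="more" style="display:none;">' . $content . '</div>';
print $html;
```

Because the text is in the markup from the first request, no AJAX round trip is needed and the spider indexes exactly what the user eventually sees.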
  23. Stay away from off-the-shelf CMS systems! Using a framework will mean that you won't have to build many of the lower-level modules yourself, such as database abstraction, file uploading, image manipulation, user authentication, etc., as they should be available to you. However, you will need to study the framework documentation.
  24. Ajax does not work for SEO! At all! Google cannot interpret JavaScript, nor will it look at JavaScript. If /articles/article1.html is spidered, this is the URL that will be indexed, not the page that is making an AJAX call to get content from this source. Search engines spider content that is read from the HTML source. If the content is not visible within the HTML source, it is not visible to a spider. Take a look at this tool to see what your pages look like to a spider http://www.smart-it-consulting.com/internet/google/googlebot-spoofer/ Or use the Firefox tool http://www.seoforclients.com/blog/marketing/seo/how-to-browse-and-check-like-google-bot.html
  25. Why are you using a form? Form buttons will submit the page!