stijnvb

Members
  • Posts

    21
  • Joined

  • Last visited

    Never

Profile Information

  • Gender
    Not Telling

stijnvb's Achievements

Newbie (1/5)

Reputation

    0

  1. Is this all the code you are using? It looks like you still need to call mysql_query($sqlpartnersbodytype); for the query to actually run (there's a small sketch of that after this list) :-)
  2. Hi Crayon, Now that's what I call a solution! :-) That does exactly what I need!!! Thanks
  3. Got home and put myself to it. I got exactly what I needed, but I think this could be done much shorter. Anyway, this might be useful for you:

     <?php
     // Split the article into sentences (explode removes the ". " after each one)
     $tot_article_array = explode('. ', $tot_article);

     // Add the . back after each sentence
     $current_add_period = 0;
     while ($current_add_period < count($tot_article_array)) {
         $tot_article_array[$current_add_period] .= ".";
         $current_add_period++;
     }

     // Combine sentences into chunks that stay under 1000 characters
     $current_combine_count = 0;
     $trunk_array_count = 0;
     $trunk_array = array('');
     while ($current_combine_count < count($tot_article_array)) {
         $fits = strlen($trunk_array[$trunk_array_count]) + strlen($tot_article_array[$current_combine_count]) < 1000;
         if ($fits || $trunk_array[$trunk_array_count] === '') {
             // Append the sentence (an over-long sentence still gets a chunk of its own)
             $trunk_array[$trunk_array_count] .= " " . $tot_article_array[$current_combine_count];
             $current_combine_count++;
         } else {
             // Current chunk is full: start a new one and retry this sentence
             $trunk_array_count++;
             $trunk_array[$trunk_array_count] = '';
         }
     }
     ?>
  4. Hi guys, I have long strings which I want to cut into smaller strings stored in an array. As an example, each piece should be less than 2000 characters and cut off at the dot (.) closest to the 2000th character. This way, a string of 10000 characters will likely be cut into 6 parts, all ending with a dot. What would be the best way to get this done?
  5. I really don't see your problem ... It's as easy as 1..2..3, but I guess I'm missing something.
  6. In your DB you just add a field with parent_id. This refers to the row id of the page which linked to this page. This way you can make as many subcategories as you want. You could either check if a page already exists in the DB (cross links) and then ignore that page, or you could store those cross links as well, depending on what you're planning to do with the information you're gathering.
  7. Seriously?? I can still log in to your cPanel as of 2 seconds ago ...
  8. A little advice: change your password right away, as I can log into your cPanel with it ...
  9. Did you honestly just put your DB username and password here!? Have you tried running the SQL query through, say, phpMyAdmin? You might have an error (like a wrong field name?). Also try adding backticks (``) around your field names in the query, and removing the '' around $ip. Just a few ideas that probably won't solve the problem! Good luck.
  10. I honestly don't really see the point of first putting all the content in an array and afterwards inserting it into the DB; it's like doing the same thing twice. Just make a simple table in MySQL (probably only 4 columns required: id, source/url, parent_id, content; there's a schema sketch after this list) and insert a new row for each page your bot visits. This gives you tons of possibilities for visualising the data (right away, or later on) as well as processing it. If you want to go with the array plan, I'm not sure how it's done, and that's why I'd personally go with the DB solution ;-)
  11. Ken, you offered me more than I asked for! Does exactly what I needed! It's not a built-in function, but I don't think that really matters if I don't even have to write it myself!!! Thanks! Stijn
  12. Wouldn't it be easier to dump all these contents into a MySQL table and reference each row to its parent's ID? This way you could regenerate a nice tree, and store the scraped contents for later use.
  13. I could probably use explode and put everything together except the last part ... but I'm pretty sure there's a built-in function for this ...
  14. What I want to do is simple: the string "this_is_an_example.php" should return "this_is_an_", while "this_is_an_example_filename.php" should return "this_is_an_example_". Is there a simple function which can cut off the part after the last _? I have been looking at and trying several functions, but can't seem to find the right one for this. (If the last _ gets cut off as well, that's something I can live with :-) ) Any help will be greatly appreciated! (There's a sketch of one way after this list.)
  15. Thanks for the tip. I'm not sure why I didn't think of it sooner, but I just used cURL, as I stated in an earlier post ... I was using a class provided by the API provider, and didn't really bother looking into it. Problem solved :-) Thanks for the help! (A minimal cURL sketch follows after this list.)
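A minimal sketch of the point in post 1, assuming $sqlpartnersbodytype already holds a valid SELECT statement and an open mysql_connect() connection; the variable name comes from the thread, everything else here is illustrative, using the same old mysql_* API the thread does:

<?php
// Assumes an open mysql_connect()/mysql_select_db() connection and that
// $sqlpartnersbodytype holds the SQL string built earlier in the thread.
$result = mysql_query($sqlpartnersbodytype);

if (!$result) {
    // Show what went wrong instead of failing silently
    die('Query failed: ' . mysql_error());
}

while ($row = mysql_fetch_assoc($result)) {
    // Column names depend on the original table, which isn't shown in the thread
    print_r($row);
}
?>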
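Posts 10 and 12 (and the parent_id idea in post 6) describe storing each crawled page as a row that points back to the page that linked to it. A minimal sketch of that, again in the thread's mysql_* style; the column names (id, url, parent_id, content) come from post 10, while the table name pages and the variables $url, $parent_id and $content are assumptions:

<?php
// Hypothetical table holding one row per crawled page
mysql_query("
    CREATE TABLE IF NOT EXISTS pages (
        id        INT AUTO_INCREMENT PRIMARY KEY,
        url       VARCHAR(255) NOT NULL,
        parent_id INT DEFAULT NULL,  -- id of the page that linked here (post 6)
        content   TEXT
    )
") or die(mysql_error());

// One INSERT for every page the bot visits, pointing back to its parent row
$sql = sprintf(
    "INSERT INTO pages (url, parent_id, content) VALUES ('%s', %d, '%s')",
    mysql_real_escape_string($url),
    (int) $parent_id,
    mysql_real_escape_string($content)
);
mysql_query($sql) or die(mysql_error());
?>

Rebuilding the tree later is then just a matter of selecting the rows and grouping children under their parent_id.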
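For the question in posts 13 and 14 (drop everything after the last underscore), a short combination of the built-ins strrpos() and substr() does the job; the example strings come from post 14, the function name is made up:

<?php
function keep_up_to_last_underscore($str) {
    $pos = strrpos($str, '_');         // position of the last underscore
    if ($pos === false) {
        return $str;                   // no underscore: return the string unchanged
    }
    return substr($str, 0, $pos + 1);  // keep everything up to and including the last _
}

echo keep_up_to_last_underscore("this_is_an_example.php");          // this_is_an_
echo keep_up_to_last_underscore("this_is_an_example_filename.php"); // this_is_an_example_
?>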
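Post 15 only mentions that cURL solved the problem; as a rough illustration, a plain GET request with PHP's cURL extension looks something like this (the URL is a placeholder, and the provider's class from the thread is not shown):

<?php
$url = 'http://api.example.com/endpoint';  // placeholder, not the real API

$ch = curl_init($url);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // return the body instead of printing it
curl_setopt($ch, CURLOPT_TIMEOUT, 30);          // give up after 30 seconds

$response = curl_exec($ch);
if ($response === false) {
    die('cURL error: ' . curl_error($ch));
}
curl_close($ch);

echo $response;
?>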
