Posts posted by dalecosp

  1. Can you set up a VM "dev box" on a laptop and take your projects with you?

    My employer didn't have many "code gurus" and wasn't too interested in seeing the code ... they wanted to look at what the code produced.

    Dev sites/projects worked well for that.  I set them up on a private VM and took the laptop to the interview.  It seems to have worked.... :)

  2. Well, the issue is more with the time than the processor, as a curl call isn't really that processor intensive. It's mostly just waiting on the internet to send the page.

    Yes, I suppose if all you're doing is downloading a page, it's not. We do a lot of page parsing after the download, in VMs, and CPU does become an issue, particularly if we use threaded workers for the task(s).

     

    You'd just set up some means of communicating with the script so you can tell it what URLs to download or any other actions you need it to do.

    Might be fun to run as a socket server in a "while(1)" loop....
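
    Roughly like this, say --- just a minimal sketch; the port number and the little "fetch <url>" protocol are made up for illustration:

    <?php
    // minimal long-running worker: listens on a local port and downloads
    // whatever URL a client tells it to (port and protocol are invented here)
    $server = stream_socket_server("tcp://127.0.0.1:8090", $errno, $errstr);
    if (!$server) die("$errstr ($errno)\n");

    while (1) {
        $conn = @stream_socket_accept($server);   // wait for a client
        if (!$conn) continue;                      // timed out, keep waiting
        $line = trim(fgets($conn));
        if (preg_match('/^fetch\s+(\S+)/', $line, $m)) {
            $page = file_get_contents($m[1]);      // the actual download
            fwrite($conn, strlen($page) . " bytes fetched\n");
        } else {
            fwrite($conn, "unknown command\n");
        }
        fclose($conn);
    }
    ?>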

  3. If you have a URL, you can use something as simple as file_get_contents() to get page content. There are some caveats --- the PHP configuration must be set to allow this (see the allow_url_fopen setting).

     

    The next option is cURL, which is often used for this sort of thing.
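
    For instance, a rough sketch of both options (the URL is just a placeholder):

    <?php
    $url = "http://www.example.com/";

    // option 1: file_get_contents() --- needs allow_url_fopen enabled
    $html = file_get_contents($url);

    // option 2: cURL
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // return the page rather than printing it
    $html = curl_exec($ch);
    curl_close($ch);
    ?>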

     

    Depending on your environment, there may be external programs that could be leveraged for this (for example, many 'Nix environments have the lynx browser installed, which can be called with "-dump" to get a text rendering of the page, or "-source" for the raw HTML).

     

    Otherwise you're probably reduced to writing something yourself using socket functions.

     

    HTH,

  4. As processing time is also an issue for you, I guess you could fork, or you could just write multiple handlers and have the UI call them all. I'm not sure it matters much; forking might be better if it means less front-end work. Either way, you're probably going to end up pegging the CPU at 100%, I'd imagine, unless you have significant machine resources or the app runs in a cluster/cloud.
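
    If you do go the forking route, a bare-bones sketch might look like this (CLI only --- the pcntl_* functions aren't available under most web SAPIs; the task names and do_heavy_work() are hypothetical):

    <?php
    $tasks = array("task_a", "task_b", "task_c");   // made-up task names

    foreach ($tasks as $task) {
        $pid = pcntl_fork();
        if ($pid == -1) {
            die("fork failed\n");
        } elseif ($pid == 0) {
            do_heavy_work($task);   // hypothetical worker; the child does the heavy lifting
            exit(0);                // then exits so it doesn't keep looping
        }
        // parent falls through and spawns the next child
    }

    // parent waits for all the children to finish
    while (pcntl_waitpid(0, $status) != -1) {
        // could inspect $status here
    }
    ?>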

     

    This stuff ain't too easy, is it? :)

  5. I'm not sure I know enough about your system and requirements to say definitely "yea" or "nay" on that.

     

    Here's how one of my apps works.

     

    1. The UI page is loaded.

    2. The user picks a task.

    3. Pressing the control for this task activates JS that makes an AJAX call to a PHP handler.

     

    Here's the handler, more or less:

    <?php
    // doit.php --- for the RESTful home page; this calls one of the
    // doits/*_doit.php files based on the "doit" GET string.

    // basename() strips any path components so a crafted value can't
    // escape the doits/ directory
    $who = basename($_GET['doit']);

    $doitfile = "doits/" . $who . "_doit.php";

    echo "Running $doitfile\n"; // for the AJAX; this subsystem needs more thought given to it.

    include $doitfile;

    ?>

     

    The script's echo output (see above comment) is returned, and JS in the UI reads it into a "status" DIV. The $doitfile follows the same logic, echoing needed information when finished, which also gets returned to the status div. The $doitfiles take anywhere from a couple of minutes to a day or more to run, depending on what they're doing (and to $whom).

     

    Cron isn't involved. I've not forked anything either, but I've considered it (we currently have to run a big job or two over the weekend because there's no point in the user doing nothing for two days) ;) ;)

     

    Hope this helps,

  6. D'oh, too true!

     

    I need to ask the boss for more coal in the stove ... brain's apparently frozen.

     

    That would be the reason I have this:

     

    function mysqli_result($result, $row, $field) {
        // drop-in replacement for the old mysql_result()
        if (@$result->num_rows == 0) return 'unknown';
        $result->data_seek($row);          // jump to the requested row
        $one = $result->fetch_assoc();
        return trim($one[$field]);         // hand back the requested column
    }
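
    Used pretty much like the old one --- assuming $link is an existing mysqli connection; the table and column are just examples:

    $result = mysqli_query($link, "SELECT name FROM users WHERE id = 7");
    echo mysqli_result($result, 0, 'name');   // first row, 'name' column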

  7. Reading code is important, though; try not to feel overwhelmed.

     

    Grab a section of code, copy it into your IDE/editor, and add comments as you figure out what each line is doing.

     

    Then grab the next section and do the same.

     

    Once you've got a big section annotated in this way, go back and read over all *your* comments. It can be very enlightening and helpful.

  8. When you call "mysql_fetch_assoc", you're asking for an associative array, which is what you're getting.

     

    This would work with the old mysql_ functions you're using:

     

    $result = mysql_result(mysql_query("SELECT COUNT(*) FROM Draws"), 0);

     

    You really *should* be using mysqli_ instead:

     

    $con = mysqli_connect($hostname, $username, $password, $database) or die("Unable to connect to MySQL");

    $result = mysqli_query($con, "SELECT COUNT(*) FROM Draws");
    $row = mysqli_fetch_row($result);   // COUNT(*) comes back as a single-row result
    $num_rows = $row[0];

    echo "There are $num_rows rows in Draws.";

  9. Are we talking user interface here?

    This sentence:

    "how would I go about ensuring nobody could choose a box already chosen?"

    sounds a bit like that; for web forms these days it's usually handled in Javascript (then double-checked in PHP after submission so it doesn't screw up the server).

    As to maximum values to store in an array ... I suppose there's a working limit.  Do you have an alternate plan for storing these values?  The thing to do would be to test both implementations and see whether there's any performance hit or gain from one or more of the proposed storage solutions.
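
    For the PHP double-check, a rough sketch --- the "boxes" table, its columns, and the $mysqli connection are all assumptions here:

    <?php
    $box = (int) $_POST['box'];

    // assumed schema: boxes(id, taken)
    $stmt = $mysqli->prepare("SELECT taken FROM boxes WHERE id = ?");
    $stmt->bind_param("i", $box);
    $stmt->execute();
    $stmt->bind_result($taken);
    $stmt->fetch();
    $stmt->close();

    if ($taken) {
        echo "Sorry, that box has already been chosen.";
    } else {
        // mark it taken, record the user's pick, etc.
    }
    ?>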

  10. The comments on the manual page for curl_setopt include this:

     

    <?php
    // parse a Netscape-format cookie file (the kind CURLOPT_COOKIEJAR writes)
    // into an array of name => value pairs
    function _curl_parse_cookiefile($file) {
        $aCookies = array();
        $aLines = file($file);
        foreach ($aLines as $line) {
            if ('#' == $line[0])                    // skip comment lines
                continue;
            $arr = explode("\t", $line);
            if (isset($arr[5]) && isset($arr[6]))
                $aCookies[$arr[5]] = trim($arr[6]); // trim the trailing newline
        }
        return $aCookies;
    }
    ?>

     

    Props to "prohfesor@gmail" ...
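
    A quick (hypothetical) usage sketch --- CURLOPT_COOKIEJAR writes the Netscape-format file that the function above parses:

    <?php
    $ch = curl_init("http://www.example.com/");              // placeholder URL
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_COOKIEJAR, "/tmp/cookies.txt"); // written when the handle closes
    curl_exec($ch);
    curl_close($ch);

    $cookies = _curl_parse_cookiefile("/tmp/cookies.txt");
    print_r($cookies);   // name => value pairs
    ?>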

  11. I've done some for various different forums in the past. An easy way to handle it is to just work out what the forms need for data (like the login form, or registration form) and then simply send that data to the URL that normally processes it in the forum. If you're lucky, the forum will already be able to handle AJAX requests or something, which makes this even easier.

     

    It depends how deeply you need to integrate it.

    True, and thanks.  cURL can sometimes be leveraged in this use case, as well, as you imply.

     

    The really knotty parts are things like cookie management (you'll typically have two you need to keep track of), and session management, which for some gadawful reason lots of people like to stick in the DB.
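
    As a rough sketch of the "send the form data to the URL that normally processes it" idea --- the forum URL and field names are guesses; check the real login form for the actual ones:

    <?php
    $jar = "/tmp/forum_cookies.txt";

    $ch = curl_init("http://forum.example.com/login.php");   // hypothetical login URL
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_POST, true);
    curl_setopt($ch, CURLOPT_POSTFIELDS, http_build_query(array(
        'username' => 'someuser',       // field names vary by forum software
        'password' => 'secret',
    )));
    curl_setopt($ch, CURLOPT_COOKIEJAR,  $jar);   // save whatever cookies the forum sets
    curl_setopt($ch, CURLOPT_COOKIEFILE, $jar);   // ...and send them back on later requests
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
    $response = curl_exec($ch);
    curl_close($ch);
    ?>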

  12. These are sometimes called "bridge" addons/modules.  You might look for one of those already written.

    I've written my own bridge between a site and vBulletin.  Not for the timid, but if you've got some experience you should be able to hack something together fairly quickly.
