Posts posted by RussellReal

  1. you could echo the names like so:

     

    choices[questionNumber][1]

    choices[questionNumber][2]

    choices[questionNumber][3]

    choices[questionNumber][4]

     

    for example..

     

    while ($row = mysql_fetch_assoc($q)) {
      echo "{$row['id']}. {$row['Que_Question']} <br />\n";
      for ($i = 1; $i < 5; $i++) {
        echo "<input type=\"checkbox\" name=\"choices[{$row['id']}][{$i}]\" />{$row['Que_Choice'.$i]}";
      }
    }

     

    and pull the answers out of $_POST['choices'] by question id: $_POST['choices'][$idNumber] :)
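    To make that concrete, here's a minimal sketch of reading the submitted checkboxes back out on the processing page (the question ids and values below are made up for illustration):

    ```php
    <?php
    // Hypothetical submitted data, following the choices[id][n] naming
    // pattern from the form above: question id => checked choice numbers.
    $_POST['choices'] = array(
        7 => array(2 => 'on'),            // question 7: choice 2 checked
        9 => array(1 => 'on', 4 => 'on'), // question 9: choices 1 and 4 checked
    );

    foreach ($_POST['choices'] as $questionId => $checked) {
        foreach (array_keys($checked) as $choiceNumber) {
            echo "Question $questionId: choice $choiceNumber\n";
        }
    }
    ```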

  2. #1, your question is very vague.

    #2, mysql_fetch_array is more or less for looping through a result set by numeric index. You're using it the way you would use mysql_fetch_assoc, so you're pulling duplicate entries into the array (each column keyed both by number and by name) for no reason.. but it really doesn't matter..

     

    Bottom line, please rephrase your question :)

  3. haha, this could be done completely in CSS like HAKU said. I'll show you an example (without CSS background-image or background-position hacks):

     

    <style type="text/css">
    a img { border: none; }
    a .dn { display: none; }
    a:hover .dn { display: block; }
    a:hover .up { display: none; }
    </style>
    <a href="link.html">
    <img class='up' src="linkUp.gif" />
    <img class='dn' src="linkDn.gif" />
    </a>

     

    oh, and to answer your question: you're most likely better off using .src, as setAttribute might not behave consistently in all browsers

     

  4. First, let me start by saying I'm not running a porn site and I'm not a hacker.. so please, no moral debate; I will explain exactly what I want it for in a moment.

     

    Basically what I need is something that will listen for when a page starts loading, to cancel the loading and then load up a different page :)

     

    Basically I have an iframe which points to someone else's Java applet; I don't take any credit away from them. If you click that applet's close button, it redirects the browser's top.location instead of the frame it is in, which sends users away from my site..

     

    I simply want to keep them on MY site until they navigate away from my page manually or if they close the browser.

     

    thanks :)

  5. Ok, let me draw up a scenario for you guys :)

     

    My site's code: lala.com

    <html><body>
      <iframe src="lol.com"></iframe>
    </body></html>

     

    lol.com's code:

    <html><body>
      <iframe src="random url"></iframe>
    </body></html>
    

     

    from lala.com I want to get lol.com's random url from their iframe.. I DO NOT WANT TO MODIFY THE DOM, BUT IF I MUST, IT'S OK :P

     

    I know that frames[0][0].location WILL WORK in internet explorer, but not in any other browser..

     

    I need a cross-browser solution; this feature is instrumental and needs immediate attention :) THANKS!

  6. this can be done fairly easily.. but you need to create the inputs and such yourself.. then the submit button's onclick will gather all the information and trigger a function on the parent page

     

    so you'd do something like this

     

    window.parent.formSubmit(field1,field2,field3,field4);

     

    and then formSubmit sets the values of the corresponding fields inside the parent's form (which you create beforehand) and triggers that form to submit..

     

    because the fields inside the iframe will never be visible to the form on the parent's page..

     

    so you'd need to manually send the data from the child to the parent like I said above..

     

    goodluck :)

  7. According to the tech support people, the reason larger uploads (> 2MB) would fail was due to either a timeout on how long the script was allowed to run or as pointed out, a memory limitation.

     

    So would moving to a VPS very likely fix this type of error?

     

    The reason I ask this is because, if I understand correctly, with a VPS one still shares the cpu and memory w/ an unknown # of other applications and therefore my app is contending for resources. My guess is that even with "busy" neighbors my uploads that were in effect being yanked would still proceed, but just take longer... correct?

     

    Thanks for responding!

     

     

    the only resource which is truly shared on a VPS is the bandwidth..

    on a VPS you get RAM, and then you get what they call burstable RAM..

    the dedicated RAM is what your account is guaranteed.. you will never be denied that much RAM. burstable RAM is how much you could be bumped up to if other people aren't using as much and your applications require more.. but that is your absolute max, and there is no guarantee you'll ever be able to touch your burstable limit..

    most VPS plans also give you a certain dedicated CPU percentage; I'm not sure if they set that up as burstable too..

    so the only thing that is really shared is the incoming and outgoing bandwidth

  8.  

    Today I got GoDaddy tech support to replicate the error. Their explanation was that the script stops (or is terminated?) due to a setting in the shared hosting environment and that moving to a virtual dedicated server 'might' correct it. Not sure if that makes sense since it seems that this is a resource issue - but I don't know much about shared hosting vs VPS.

     

    At that point (I'd been on the phone with them for over an hour) I didn't remember this article - http://help.godaddy.com/article/1475? which to me contradicts what I was told. Oh well, guess I'll call them back.

     

    BTW, I did try bumping up the memory limit and checking memory_get_usage(true) ...but without success.

     

    if you're going to move to a VPS, there are a lot of better VPS providers out there for less money than GoDaddy charges. I currently use WestHost (no, this isn't a solicitation; no referral links here :P, just a friendly suggestion!)

     

    - Russell

  9. Hi

     

    I know, just that it seems wrong. The chances of an issue are small but an easily returned date of last update would mean effectively zero chance. But a trigger means a load of extra hidden updates.

     

    All the best

     

    Keith

     

    seems to only need 1 hidden update :) I doubt it would be any less effective than if it updated the way it should on Windows.. the only difference is it will make the script you're writing less portable.. but that's not a huge problem.

  10. mm, your formatting is horrid

    echo "<tr>
        <td class='trow2' align='center' valign='middle' width='1'></td>
        <td class='trow2' valign='middle'>
        <strong><a href='server.php?view=details&id=" . $list['id'] . "'>" . capitalizeFirstCharacter($list['servername']) . "</a></strong>
        <div class=\"stat2\">
        <div class=\"stat\">";
    if (!$sock = @fsockopen($list['serverip'], $list['serverport'], $num, $error, 1)) {
        echo "<font color='red'><b>Offline</b></font>";
    } else {
        echo "<font color='green'><b>Online</b></font>";
    }
    echo "</div>
        </div>
        <div class='smalltext'>" . $list['shortdescription'] . "</div>
        </td>
        <td class='trow1' valign='middle' align='left' style='white-space: nowrap'><span class='smalltext'>" . $list['revision'] . "</span></td>
        <td class='trow2' valign='middle' align='right' style='white-space: nowrap'><font size='4px'>" . $voteAmount . " Votes</font></td>
        </tr>";
    

     

    try that

     

     

  11. using a permanent table for results, then dropping and re-inserting results, will add a lot of overhead to that table, and those are a lot of heavy operations.. I really doubt Google uses MySQL for their search engine..

     

    Their search engine is really robust, and MySQL has its limitations.. temporary tables will speed up your results tenfold..

     

    I was working on a project with a pretty cool group of people.. they had over 40 million results in their database, and they wanted to index all of them in a search plus add commenting functionality. No matter how hard we tried to get the comments and the likes working across 40 million results with just normal queries, every query took over 1 second with the comment calculations, not to mention scanning 3 tables for data.. but once we pulled all of that information into a temp table and did most of the filtering there (beyond the basic filters applied directly to the result set), it sped up dramatically, to under 150ms/query under a good amount of server load..

     

    temp tables are great because they are exclusive to your session with MySQL and aren't stored the same way as permanent tables, so they're a lot faster to work with.. you can also index them specifically for your application instead of working with indexes that are designed for the project as a whole..

     

    I would look into temporary tables for sure :)
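    The temp-table approach above can be sketched like this (the table and column names are hypothetical, just for illustration):

    ```sql
    -- Build the expensive join/aggregation once for this session,
    -- with an index chosen for this exact query pattern:
    CREATE TEMPORARY TABLE search_results (INDEX (comment_count))
    SELECT r.id, r.title, COUNT(c.id) AS comment_count
    FROM results r
    LEFT JOIN comments c ON c.result_id = r.id
    WHERE r.title LIKE '%keyword%'
    GROUP BY r.id, r.title;

    -- Every page/sort/filter afterwards hits the small temp table,
    -- not the original 40-million-row tables:
    SELECT * FROM search_results ORDER BY comment_count DESC LIMIT 20;
    ```

    The temp table is dropped automatically when the MySQL session ends, so there is no cleanup cost on the permanent tables.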
