fivestringsurf's Achievements


Regular Member (3/5)



  1. I'm looking to spend less time managing a server. I don't enjoy it... no, I hate it. It seems every time I have to upgrade Linux and/or PHP on my DigitalOcean server, I spend no less than an entire weekend figuring out how to fix things. I've been researching "managed" server solutions for the past few weeks, but my head is surely spinning. Each time I check out the services offered or read some reviews, it seems I discover yet another contender boasting its dominance. I was hoping some of you could point me in the right direction. Currently I have 3 live websites and 3 subdomains hosted on DigitalOcean's $5/month plan. They are fairly low traffic and run flawlessly. I like DigitalOcean, and I use Bitbucket to deploy. If I could settle on a server management service, I would consider moving several other, larger websites from other hosting over to DigitalOcean. One is an old client site, but most are 15 years' worth of my own projects. So far I've looked at:
       • Forge
       • Cloudways
       • ServerPilot
       • Moss
       • getCleaver (this is so cheap, can I trust it?)
     I cannot decide! I am looking for a plan that:
       • includes auto backups (at least weekly)
       • provides server stats (realtime is ideal)
       • auto-notifies when there is a problem (text or email)
       • is Laravel/WordPress friendly
       • makes CDN integration easy
       • handles Git
       • handles staging servers well
     Any advice, wisdom, or experience to share? Thanks!
  2. Oh, that's good to know. Locally I was using the Apache that ships with macOS. I'm currently running the local Laravel server (via: ), but I also remember this was exactly the case when running XAMPP. As long as the actual LAMP stack on the server can handle concurrent requests, I don't think I have much to worry about. (As an aside, I tried looking this up on Google but found it really hard to even word it correctly to get the info I was seeking!) Thank you.
  3. Yes and yes. I've already broken the "tasks" down so that each is never more than a single HTTP request. But when I go to complete a bunch of tasks, they can all stack up and take a while. The "few minutes" was me deliberately making a task take longer so I could test my theory locally about PHP locking up for others' requests... which is still a question I have: when PHP is processing, does that mean other people can't connect to the server (i.e., they will be queued until the process I am running finishes)?
  4. Hi, I am doing some scraping and processing in PHP that is time intensive. On my local machine it was no big deal, but now that the project is live and has realtime users daily, I have some concerns. When PHP is processing, does that mean other people can't connect to the server (i.e., they will be queued until the process I am running finishes)? I hope not, but my local testing suggests it IS that way. I did this: I ran a PHP script from my browser that took a few minutes to execute. While it was running, I tried accessing the same local site via another browser tab, and it halted. If this is in fact how PHP works on the live server, how would I go about running about 6,000 processes daily that would consume ~1-4 hours of processing time? Thanks.
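(A note on the "second tab halts" symptom described above: one common cause of exactly this behavior in local tests is PHP's file-based session locking, not the web server itself. `session_start()` takes an exclusive lock on the session file, so a second request from the same browser session waits until the first script finishes. Whether that is what happened here is an assumption, but it is easy to rule out: releasing the lock with `session_write_close()` before the long-running work begins lets other requests from the same session proceed. A minimal sketch:

```php
<?php
// Sketch: release the session lock before long-running work so that other
// requests sharing this session are not queued behind this script.
session_start();
$_SESSION['job'] = 'scrape-started';   // write any session state first

// Close (and unlock) the session file; $_SESSION stays readable locally,
// but further writes to it will no longer persist.
session_write_close();

// ...long scraping/processing can now run without holding the session
// lock that would otherwise block the second browser tab...
echo "lock released\n";
```

Requests from *different* visitors use different session files, so a genuinely concurrent server (Apache with multiple workers, PHP-FPM, etc.) will serve them in parallel regardless; for the 6,000 daily jobs, moving the work out of the request cycle entirely, e.g. into a cron-driven CLI script or a queue, avoids the question altogether.)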
  5. I have a DigitalOcean account and would like to be able to auto-deploy code to the DigitalOcean server after a push from my local machine. I've been using beanstalkapp to do this with one of my projects. I love beanstalkapp... it's easy to use and well documented. Not being a sysadmin, this is important to me. The problem is they only allow 1 free repo, and I need another for this charitable app I'm making for my school. Is there any other free repo solution I can use to push code to a staging/production server (at DigitalOcean)? Please keep in mind I'm not a sysadmin, so it has to be simple like beanstalkapp. Thanks!
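(For what it's worth, the push-to-deploy behavior described above can also be done with plain Git and no third-party service: a bare repository on the droplet plus a `post-receive` hook that checks the pushed branch out into the web root. The paths and branch name below are assumptions for illustration; a minimal sketch of the hook:

```shell
#!/bin/bash
# Hypothetical post-receive hook for a bare repo on the server
# (e.g. placed at /home/deploy/site.git/hooks/post-receive, made executable).
# Paths and branch name are placeholders -- adjust to your setup.
TARGET=/var/www/example.com       # directory the site is served from (assumption)
GIT_DIR=/home/deploy/site.git     # the bare repository on the droplet (assumption)
BRANCH=main                       # branch that should trigger a deploy

# Git feeds one "<oldrev> <newrev> <refname>" line per updated ref on stdin.
while read oldrev newrev ref; do
    if [ "$ref" = "refs/heads/$BRANCH" ]; then
        echo "Deploying $BRANCH to $TARGET..."
        git --work-tree="$TARGET" --git-dir="$GIT_DIR" checkout -f "$BRANCH"
    fi
done
```

Locally you would then add the droplet as a remote (`git remote add production deploy@server:/home/deploy/site.git`) and deploy with `git push production main`. Free hosted repos on GitHub, GitLab, or Bitbucket can sit alongside this as the backup/collaboration copy.)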
  6. Thanks for the reply... the notification of your post got spammed and I just found it by chance. Interesting approach, similar to mine from the start, but I tried your idea out just for haha's. The weird part is I don't get the usual normal distribution (68%, 27%, etc.) when I run large tests; I get results near 48%, 27%, 17%, 7%. Strange. Any thoughts? Here's my code:

     <?php
     $pickAndSave = array();

     // average two random integers in [-4, 4], 10,000 times
     for ($n = 0; $n < 10000; $n++) {
         $tot = 0;
         for ($i = 0; $i < 2; $i++) {
             $tot += mt_rand(-4, 4);
         }
         $pickAndSave[] = $tot / 2;
     }

     // tally results by standard-deviation band
     $sd1 = $sd2 = $sd3 = $sd4 = 0;
     foreach ($pickAndSave as $val) {
         if ($val < -3 || $val > 3) {
             $sd4++;
         } elseif ($val < -2 || $val > 2) {
             $sd3++;
         } elseif ($val < -1 || $val > 1) {
             $sd2++;
         } else {            // -1 <= $val <= 1
             $sd1++;
         }
     }
     $tot = $sd4 + $sd3 + $sd2 + $sd1;

     // print results
     echo '<pre>';
     echo "Report:";
     echo "\n $sd1 " . $sd1 / $tot * 100;
     echo "\n $sd2 " . $sd2 / $tot * 100;
     echo "\n $sd3 " . $sd3 / $tot * 100;
     echo "\n $sd4 " . $sd4 / $tot * 100;

     Report:
     4817 48.17
     2688 26.88
     1744 17.44
     751 7.51
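(The 48/27/17/7 split above is actually what the code should produce: the average of two uniform random integers follows a *triangular* distribution, not a normal one. The sum of two integers from -4 to 4 has 81 equally likely combinations, and counting them gives P(|avg| ≤ 1) = 39/81 ≈ 48.1%, then 22/81 ≈ 27.2%, 14/81 ≈ 17.3%, and 6/81 ≈ 7.4% for the outer bands, matching the report almost exactly. To get true normal deviates from uniform random numbers, the standard "nifty formula" is the Box-Muller transform. A minimal sketch:

```php
<?php
// Box-Muller transform: converts two independent uniform(0,1] samples
// into one sample from a standard normal distribution (mean 0, sd 1).
function randNormal(): float {
    $u1 = mt_rand(1, mt_getrandmax()) / mt_getrandmax(); // in (0,1], avoids log(0)
    $u2 = mt_rand(0, mt_getrandmax()) / mt_getrandmax(); // in [0,1]
    return sqrt(-2 * log($u1)) * cos(2 * M_PI * $u2);
}

// Tally how many samples land within 1, within 2, and beyond 2 sd.
$n = 100000;
$within1 = $within2 = $beyond2 = 0;
for ($i = 0; $i < $n; $i++) {
    $z = abs(randNormal());
    if ($z <= 1)     { $within1++; }
    elseif ($z <= 2) { $within2++; }
    else             { $beyond2++; }
}
// Expect roughly 68.3%, 27.2%, and 4.6% respectively.
printf("<=1 sd: %.1f%%  1-2 sd: %.1f%%  >2 sd: %.1f%%\n",
    100 * $within1 / $n, 100 * $within2 / $n, 100 * $beyond2 / $n);
```

No precomputed 1,000-entry array is needed; each call to `randNormal()` is an independent draw from the bell curve, and you can scale to any mean/sd with `$mean + $sd * randNormal()`.)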
  7. Hi all, I'm not sure if there is a super lean way to randomly generate numbers that follow a normal distribution (bell curve), but if there is... I'm all ears, uh, eyes that is :). OK, normal distribution; you know, the rule that says there is a 68.3% chance of a value landing within 1 standard deviation of the mean (between -1 and 1, inclusive), a 27.2% chance of it landing between 1 and 2 standard deviations out (or between -2 and -1), and so on. When it comes to math and coding, I usually get the job done, but in my own way. I have found a solution, but it involves writing one large array (1,000 entries long) on the fly each time, consisting of randomly generated values that fit the normal distribution model. When I run tests, my results confirm it works, and as expected, the larger the data pool, the closer the results get to a precise normal distribution (the law of large numbers at work! Flip a coin a million times and you get closer to 50/50). Can't I use some nifty formula to randomly produce numbers throughout the normal distribution curve? As it stands now, I am running several loops to build the master list of normally distributed numbers before I even use them... seems like a lot of overhead. Please advise, thank you.
