
fivestringsurf

Members
  • Posts

    96
  • Joined

  • Last visited

Profile Information

  • Gender
    Not Telling

fivestringsurf's Achievements

Regular Member

Regular Member (3/5)

0

Reputation

  1. I'm looking to spend less time managing a server. I don't enjoy it... no, I hate it. It seems every time I have to upgrade Linux and/or PHP on my DigitalOcean server I spend no less than an entire weekend figuring out how to fix things. I've been researching "managed" server solutions for the past few weeks, but my head is surely spinning. Each time I check out the services offered or read some reviews, it seems I discover yet another contender boasting that it's the best. I was hoping some of you could point me in the right direction.
     Currently I have 3 live websites and 3 sub-domains hosted on DigitalOcean's $5/month plan. They are fairly low traffic and run flawlessly. I like DigitalOcean. I use Bitbucket to deploy. If I could settle on a server management service, I would consider moving several other larger websites from other hosting over to DigitalOcean. One is an old client site, but most are 15 years' worth of my own projects.
     So far I've looked at:
       • Forge
       • Cloudways
       • Server Pilot
       • Moss
       • getCleaver (this is so cheap, can I trust it?)
     I cannot decide! I am looking for a plan that:
       • includes auto backups (at least weekly)
       • provides server stats (realtime is ideal)
       • auto-notifies when there is a problem (text or email)
       • is Laravel/WordPress friendly
       • makes CDN integration easy
       • handles Git
       • handles staging servers well
     Any advice, wisdom, or experience to share? Thanks!
  2. Oh, that's good to know. Locally I was using the Apache that ships with macOS. I'm currently running the local Laravel dev server (via http://127.0.0.1:8000), but I also remember this was exactly the case when running XAMPP. As long as the actual LAMP stack on the live server can handle concurrent requests, I don't think I have much to worry about. (As an aside, I tried looking this up on Google but found it really hard to even word the question correctly to get the info I was seeking!) Thank you
  3. Yes and yes. I've already broken the "tasks" down so that none of them is ever more than a single HTTP request. But when I go to complete a bunch of tasks they can all stack up and take a while. The "few minutes" was me deliberately making a task take longer so I could test my theory locally about PHP locking up for others' requests... which is still a question I have: when PHP is processing, does that mean other people can't connect to the server (i.e., they will be queued until the process I am running finishes)?
  4. Hi, I am doing some scraping and processing in PHP that is time-intensive. On my local machine it was no big deal, but now that the project is live and has real users daily I have some concerns. When PHP is processing, does that mean other people can't connect to the server (i.e., they will be queued until the process I am running finishes)? I hope not, but my local testing makes it seem like it IS that way. I did this: I ran a PHP script from my browser that took a few minutes to execute. While it was running I tried accessing the same local site via another browser tab, and it halted. If this is in fact how PHP works on the live server, how would I go about running about 6000 processes daily that would consume roughly 1-4 hours of processing time? Thanks.
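     For reference, the test script was basically just a stand-in like the one below (a minimal sketch - the real script does the scraping, the file name is made up, and the comments are only my guesses at what might be making the second tab wait):

        <?php
        // long_task.php - stand-in for the slow scraping script used in the test

        session_start();          // only relevant if the pages share a session

        // Guess #1: PHP's default file-based session handler locks the session
        // file for the whole request, so a second request from the same browser
        // waits on that lock. Calling session_write_close() early releases it.
        session_write_close();

        set_time_limit(0);

        for ($i = 0; $i < 180; $i++) {
            sleep(1);             // simulate a few minutes of work
        }

        echo 'done';

        // Guess #2: the local dev server (php -S / "php artisan serve") handles
        // one request at a time, so locally things will look blocked no matter
        // what - a proper Apache/Nginx + PHP setup serves requests in parallel.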
  5. Ever have one of those problems that makes no sense? Yep, that's been me for the past 2 days... I've had an Apache/PHP stack set up on my OS X machine for at least 6 years with little to no problems. I had all of my vhosts for local sites under /etc/apache2/other (they were included via httpd.conf). All of a sudden my local sites stopped working, and after tracking down the problem with:
        > sudo apachectl -t
     I found out that my vhosts were no longer there... heck, the whole directory /etc/apache2/other was missing. Now I'm quite sure I did not touch this... the question remains: what the hell happened? I'm not foolish enough to think someone is going to be able to tell me precisely what happened, but some insight would be nice: Has this happened to any of you before? Did OS X auto-update something and wipe things out? I mean... I didn't update to a major release... still on El Capitan. Was it an automatic Xcode update? I'm very baffled right now.
  6. So I did get it working, thanks to all the helpful minds here in this forum. @Jacques1 ob_flush() was key! It was really difficult to wrap my mind around the solution because EventSource wasn't as easy to work with as AJAX. EventSource expects a very specific (sort of bizarre) response structure, and if even one line ending is off it doesn't work. I also couldn't grasp how to upload the file and then listen for it, because you can't send files with EventSource, so I couldn't get EventSource to listen in on the file upload progress. But that wasn't the biggest deal... I just used my normal AJAX-style upload function with the XMLHttpRequest progress-handler thingee to do the work. Here's what I did:
       1. Upload the file to import.php using AJAX and display the progress (pretty straightforward stuff).
       2. As soon as the file is done uploading to import.php, I log it to the database and generate a hash.
       3. I send the hash back with the returning JSON to the AJAX script that started it all.
       4. I immediately call EventSource to start listening in on a separate script that lives at import_progress.php (I used ?url_hash=123abc in the URL to pass the hash). I don't think EventSource is meant to pass vars... I was trying to be clever.
       5. import_progress.php checks the DB based on the hash and starts processing. Each time the processing gets through a loop it increments an (int) progress field in the database and immediately echoes the progress out, followed by ob_flush(); flush();
       6. Meanwhile, back on the client side, we're listening to the echoes and manipulating a progress bar.
     Maybe it's just me, but I really felt like I stretched the technologies, PHP in particular, to the limit here in forcing it to behave in a way it was never designed to. Passing the $_GET variable in step 4 felt a bit janky, but I didn't know any other way to do it. Once EventSource is called it has no knowledge of what has been uploaded, so this was the only way I found to do it, and it can't monitor the AJAX upload as far as I know. EventSource is kind of dangerous: it keeps calling the script. One time I wasn't paying attention and images kept on getting created... I can only imagine if I had decided to go to bed and not fix that, or at least close the browser - yikes. I'm going to have to go through my image processing classes and craft some very clever failsafes so EventSource doesn't get hung up. Maybe I can even time it out on the client side if no progress is being made after a certain time period... We'll see. I've won this battle but there's much to do.
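     In case it helps anyone doing the same thing, the guts of import_progress.php boil down to something like this (a stripped-down sketch - the table and column names are just placeholders for whatever your schema looks like, and the real script does the actual image crunching inside the loop):

        <?php
        // import_progress.php?url_hash=... - the script EventSource listens to (sketch)

        header('Content-Type: text/event-stream');
        header('Cache-Control: no-cache');

        $pdo  = new PDO('mysql:host=localhost;dbname=myapp', 'user', 'pass');
        $stmt = $pdo->prepare('SELECT id, total_items FROM import_jobs WHERE hash = ?');
        $stmt->execute([$_GET['url_hash'] ?? '']);
        $job = $stmt->fetch(PDO::FETCH_ASSOC);

        for ($done = 1; $done <= $job['total_items']; $done++) {
            // ... resize/crop one image here ...

            $pdo->prepare('UPDATE import_jobs SET progress = ? WHERE id = ?')
                ->execute([$done, $job['id']]);

            // EventSource's picky format: "data: ..." followed by a blank line
            echo 'data: ' . json_encode(['progress' => $done, 'total' => $job['total_items']]) . "\n\n";
            if (ob_get_level() > 0) { ob_flush(); }
            flush();
        }

        echo "data: done\n\n";
        flush();

     The client side is just new EventSource('import_progress.php?url_hash=...') with an onmessage handler that moves the progress bar along.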
  7. @kicken so I tried some code with fastcgi_finish_request() and unfortunately I got this:
        Fatal error: Uncaught Error: Call to undefined function fastcgi_finish_request()
     So I'm sure it's some Apache mod I'm missing. I looked into it and I think getting that going is above my pay grade... it looks complicated, and the more I read, the more I discovered that there can be issues with logging. Hmmm. It's late, but I think what I might try tomorrow is a three-pronged approach to keep all 3 phases of the script separate. Here's what I'm thinking:
       1. Upload the file and report back progress (using AJAX or this EventSource thing).
       2. Once complete, send a second call to the server to start the processing and don't bother listening for a returning msg.
       3. Now start polling the server to "listen" in on its progress (the processing will update the DB on progress).
     It's what I have in my head anyway... I'll try it tomorrow.
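     For reference, the pattern I was attempting with fastcgi_finish_request() looked roughly like this (a sketch only - the function is apparently available when PHP runs under PHP-FPM rather than as an Apache module, which would explain the undefined function error on my setup; processImages() is just a placeholder for my real image-crunching code):

        <?php
        // start_processing.php - respond first, keep working afterwards (needs PHP-FPM)

        ignore_user_abort(true);   // keep running even after the client disconnects
        set_time_limit(0);

        echo json_encode(['status' => 'processing started']);

        if (function_exists('fastcgi_finish_request')) {
            fastcgi_finish_request();   // response is sent; the script carries on below
        }

        processImages();   // hypothetical placeholder for the heavy lifting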
  8. @Jacques1 I set up a test environment and ran your code. Interesting idea, but here's what happens: it works (kinda), but it throws all the results back at once. For instance, after loading the page there is no response from the server, and then after 100 seconds it all shows up in the console. Then after another 100 seconds it does the same thing again. I can confirm this is the output/behavior in both Firefox and Chrome. Not sure if this is a limitation of my server environment. I'm running PHP 7 on OS X (my local testing rig).
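     The next thing I plan to try is putting something like this at the top of the script to rule out buffering on my end (just a guess on my part - if Apache or something like mod_deflate is buffering further down the line, this won't help):

        <?php
        // top of the server-sent-events test script - trying to defeat buffering

        @ini_set('zlib.output_compression', 'off');   // gzip would hold back the whole response
        header('Content-Type: text/event-stream');
        header('Cache-Control: no-cache');
        header('X-Accel-Buffering: no');               // only matters behind nginx, I believe

        while (ob_get_level() > 0) {   // close any output buffers PHP already opened
            ob_end_flush();
        }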
  9. @kicken, I think the only part I was missing is the cron job, because what you described is precisely what I built. Running cron every minute - would that be intensive on the server, or is that a routine kind of thing one can expect to do?
     @Jacques1, server-sent events? Hmmm, that seems enticing. But would PHP be able to echo out progress (i.e., JSON) while in the middle of processing? I thought once PHP is processing, nothing can be echoed out until it's complete? Please clarify if I'm wrong, because that could be a game-changer indeed. An exception, of course, would be monitoring the file upload progress.
     @Psycho - I incorrectly described the situation, my fault. The browser isn't locking up, of course, as it's an asynchronous call. What is happening is that the response hangs until all the processing is completed. Even if I do this:
        $uploadfiles();
        echo 'success, hash=123';
        $processImages();
     Even though the echo comes before the processing directive, it never gets sent until the entire script has completed. So I believe I have to separate the workflow into 2 scripts called separately.
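     To make that last point concrete, the split I have in mind looks roughly like this (a sketch only - the file names, temp path, and table are placeholders):

        <?php
        // upload.php (sketch) - receives the file, records a job row, responds immediately

        $hash = bin2hex(random_bytes(16));
        move_uploaded_file($_FILES['archive']['tmp_name'], "/tmp/imports/{$hash}.zip");
        // insert a "pending" row keyed on $hash in the progress table here

        echo json_encode(['status' => 'uploaded', 'hash' => $hash]);
        // the script ends here, so this echo actually reaches the browser right away

        // process.php (sketch) - called in a *separate* request with ?hash=...
        // it loops over the images, updating the progress column as it goes,
        // while the client polls (or listens via EventSource) using the same hash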
  10. I built a "bulk importer" that takes a .zip file filled with images and a corresponding CSV file that holds attributes. I'm happily using some JavaScript to provide upload-progress feedback to the user, so if the .zip file is, say, 10mb, they see its upload progress (I'm using AJAX). This is all working nicely, BUT... once the .zip hits the server I need to do A TON of processing. Each image has to be converted into 10 different sizes, cropped, etc. All entries must be entered into the database and admin text logs created. All of this actually works just fine for small files (<10mb), and I'm sure it could work with bigger files by increasing the timeout time, etc., BUT the browser "locks up" during processing and there is no real way to inform the user about the progress of their files being processed. I thought maybe I could be clever and create a "progress table" in the DB and use it like this:
       1. As soon as the .zip file is uploaded to the server, I create a row and an id.
       2. Next I send that id back to the browser (AJAX) and immediately start the laborious processing.
       3. The processing continually updates the DB with its progress.
       4. The JS receives the id and keeps polling the DB to check on the processing progress, ultimately reporting it back to the user.
     Well, my brilliant scheme doesn't seem to work and everything locks up regardless. I think I was trying to fake multi-threading and I'm not sure how to solve this problem. My end goal is to crunch huge files and keep the user notified of their progress - does anyone have good advice?
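     In case it clarifies the setup, the polling side of the scheme is about as simple as it gets (sketch only - the file, table, and column names are made up; the processing script is what updates that same row as it grinds through the images):

        <?php
        // progress_check.php?id=... - what the JS polls every couple of seconds (sketch)

        $pdo  = new PDO('mysql:host=localhost;dbname=myapp', 'user', 'pass');
        $stmt = $pdo->prepare('SELECT progress, total FROM import_progress WHERE id = ?');
        $stmt->execute([(int) ($_GET['id'] ?? 0)]);

        header('Content-Type: application/json');
        echo json_encode($stmt->fetch(PDO::FETCH_ASSOC));

     The part I can't figure out is why even this little check request hangs until the big processing request is finished.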
  11. Sweet, thanks for your advice and sample code!
  12. I spent the afternoon playing around with node.js tuts because node.js / NPM seems to be a requirement for all these newfangled front-end managers... but I quickly learned that node.js IS its own server-side language to be used instead of PHP. That abruptly ended my "delving". @kicken - do you use these tools specifically with PHP? (Are you using bowerPHP?)
  13. Over the past year I started using Composer and have realized that using a dependency manager makes development and code maintenance so much easier. Yay Composer! I can see an equally big need to do this for front-end technologies (i.e., JS & CSS). What exists for us PHP developers to help maintain the huge mess of front-end stuff we need to include? Is there something to manage and minify JS/CSS that works well with the PHP environment? Thanks