Posts posted by fivestringsurf

  1. I'm looking to spend less time managing a server.  I don't enjoy it... no, I hate it.  
    It seems every time I have to upgrade Linux and/or PHP on my DigitalOcean server, I spend no less than an entire weekend figuring out how to fix things.

    I've been researching "managed" server solutions for the past few weeks, but my head is surely spinning.  Each time I check out the services offered or read some reviews, it seems I discover yet another contender boasting its dominance.

    I was hoping some of you could point me in the right direction.

    Currently I have 3 live websites and 3 sub-domains hosted on DigitalOcean's $5/month plan.  They are fairly low-traffic and run flawlessly.  I like DigitalOcean.  I use Bitbucket to deploy.

    If I could settle on a server management service, I would consider moving several other larger websites from other hosting over to DigitalOcean.  One is an old client site, but most are 15 years' worth of my own projects.

    So far I've looked at:

    • Forge
    • Cloudways
    • ServerPilot
    • Moss
    • getCleaver (this is so cheap, can I trust it?)

    I Cannot Decide!

    I am looking for a plan that:

    • includes auto backups (at least weekly)
    • provides server stats (realtime is ideal)
    • auto notifies when there is a problem (text or email)
    • is Laravel/WordPress friendly
    • makes CDN integration easy
    • handles Git
    • handles Staging servers well

    Any advice, wisdom, or experience to share?

    Thanks!
     

  2. 3 hours ago, requinix said:

    Not even remotely the case.

    Locally, were you using the built-in server that PHP provides? Don't. It's good for quick stuff but it's not a real server. Set up your development environment to match your production environment as closely as possible.

    Oh that's good to know :)

    Locally I was using the Apache that ships with macOS.  I'm currently running the local Laravel server (via http://127.0.0.1:8000), but I also remember this being the exact case when running XAMPP.

    But as long as the actual LAMP stack on the server can handle concurrent requests, I don't think I have much to worry about.
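
    For what it's worth, one classic cause of that two-tab lockup even under a real Apache is PHP's session file lock rather than the server itself: both tabs share one session, and the long-running request holds the lock until it exits.  A minimal sketch of the usual workaround (long-task.php is a hypothetical name):

    <?php
    // long-task.php - release the session lock before the slow work, so other
    // requests from the same browser/session aren't queued behind this one.
    session_start();

    $userId = $_SESSION['user_id'] ?? null; // read whatever we need first

    session_write_close(); // releases the lock; note that writes to $_SESSION
                           // after this point will NOT be saved

    sleep(60); // stand-in for the real long-running processing

    echo 'done';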

    (As an aside, I tried looking this up on Google but found it really hard to even word the search correctly to get the info I was seeking!)

    Thank you

  3. 12 hours ago, ginerjm said:

    Perhaps you need to examine your long-running process for better ways to perform whatever task you are doing.   Are you using repetitive queries instead of one over-all one? Are you looping with an embedded query?  These are things to avoid.  A process that takes "a few minutes" to execute is either working on hundreds of thousands of records or is not written properly.

    Yes and yes.  I've already broken the "tasks" down so that each one makes no more than a single HTTP request, if it makes one at all.  But when I go to complete a bunch of tasks they can all stack up and take a while.

    The "few minutes" was me deliberately making a task take longer so I could test my theory locally about php locking up for other's requests...which is still a question I have:

    When PHP is processing, does this mean other people can't connect to the server? (They will be queued until the process I am running finishes)?

  4. Hi,

    I am doing some scraping and processing in PHP that is time-intensive.  On my local machine it was no big deal, but now that the project is live and has real users daily, I have some concerns.

    When PHP is processing, does this mean other people can't connect to the server? (They will be queued until the process I am running finishes)?

    I hope not, but my local testing suggests it IS that way.

    I did this:

    I ran a PHP script from my browser that took a few minutes to execute.  While it was running I tried accessing the same local site via another browser tab, and it halted.

    If this is in fact how PHP works on the live server, how would I go about running about 6000 processes daily that would consume ~1-4 hours of processing time?
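
    A minimal sketch of the cron-plus-job-queue pattern that usually comes up for this (the jobs table and file name are hypothetical):

    <?php
    // worker.php - run from cron, e.g. every minute:
    //   * * * * * /usr/bin/php /path/to/worker.php
    // Assumes a table jobs(id, payload, status) where web requests only
    // INSERT rows; all heavy processing happens here, outside any HTTP request.

    $db = new mysqli('localhost', 'user', 'pass', 'mydb');

    $deadline = time() + 50; // stop before the next cron tick starts

    while (time() < $deadline) {
        $row = $db->query("SELECT id, payload FROM jobs
                           WHERE status = 'pending' ORDER BY id LIMIT 1")->fetch_assoc();
        if (!$row) {
            break; // queue is empty
        }

        // claim the job; if a parallel worker beat us to it, move on
        $db->query("UPDATE jobs SET status = 'running'
                    WHERE id = {$row['id']} AND status = 'pending'");
        if ($db->affected_rows === 0) {
            continue;
        }

        // ... do the actual scraping/processing for $row['payload'] here ...

        $db->query("UPDATE jobs SET status = 'done' WHERE id = {$row['id']}");
    }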

     

    Thanks.

  5. Ever have one of those problems that makes no sense?  Yep, that's me for the past 2 days...

     

    I've had an Apache/PHP stack set up on my OS X machine for at least 6 years with little to no problems.

    I had all of my v-hosts for local sites under /etc/apache2/other (they were included via httpd.conf).

     

    All of a sudden my local sites stopped working and after tracking down the problem by using:

    >sudo apachectl -t

     

    I found out that my v-hosts were no longer there... heck, the whole directory /etc/apache2/other was missing.

     

    Now I'm quite sure I did not touch this... the question remains: what the hell happened?

     

    Now I'm not foolish enough to think someone is going to be able to tell me precisely what happened, but some insight would be nice:

     

    Has this happened to any of you before?  Did OS X auto-update something and wipe things out?  I mean... I didn't update to a major release... I'm still on El Capitan.

    Was it an automatic Xcode update?

     

    I'm very baffled right now.

  6. So I did get it working thanks to all the helpful minds here in this forum. @Jacques1 ob_flush() was key!
    It was really difficult to wrap my mind around the solution because EventSource wasn't as easy to work with as AJAX.  EventSource expects a very specific (sort of bizarre) response structure, and if even one line ending is off it doesn't work.
     
    I also couldn't grasp how to upload the file and then listen for it, because you can't send files with EventSource, so I couldn't get EventSource to listen in on the file upload progress. But that wasn't the biggest deal... I just used my normal AJAX-style upload function with the XMLHttpRequest progress-event thingee to do the work.
     
    Here's what I did:

    • Upload the file to import.php using ajax and display the progress (pretty straight forward stuff)
    • As soon as the file is done uploading to import.php I log it to the database and generate a hash
    • I send back the hash with the returning json back to the ajax script that started it all
    • I immediately call EventSource to start listening in on a separate script that lives at import_progress.php (I used ?url_hash=123abc in the URL to pass the hash).  I don't think EventSource is meant to pass vars... I was trying to be clever
    • import_progress.php checks the db based on the hash and starts processing.  
    • Each time the processing gets through a loop, it increments an (int) progress field in the database and immediately echoes the progress out, followed by ob_flush(); flush(); (see the sketch below)
    • Meanwhile, back on the client side, we're listening to the echoes and manipulating a progress bar
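
    For anyone trying the same thing, here's the rough shape of the server side.  The field and file names are illustrative, but the Content-Type header, the data: prefix, and the blank line after each message are the parts EventSource is strict about:

    <?php
    // import_progress.php - Server-Sent Events endpoint (a sketch, not the
    // exact production code).
    header('Content-Type: text/event-stream');
    header('Cache-Control: no-cache');

    $hash  = $_GET['url_hash'] ?? '';
    $total = 10; // stand-in for the real number of processing steps

    for ($i = 1; $i <= $total; $i++) {
        // ... process one chunk, update the progress column for $hash ...

        // SSE format: one or more "data:" lines, terminated by a BLANK line
        echo 'data: ' . json_encode(['progress' => $i, 'total' => $total]) . "\n\n";
        @ob_flush(); // flush PHP's output buffer, if one is active
        flush();     // flush the web server's buffer
    }

    // a final named event so the client can call eventSource.close();
    // otherwise EventSource auto-reconnects and re-runs this script
    echo "event: done\ndata: {}\n\n";
    @ob_flush();
    flush();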

    Maybe it's just me, but I really felt like I stretched the technologies, PHP in particular, to the limit here, forcing it to behave in a way it was never designed to.  Passing the $_GET variable in step 4 felt a bit janky, but I didn't know any other way to do it.  Once EventSource is called it has no knowledge of what has been uploaded, so this was the only way I found to do it, and it can't monitor the AJAX upload as far as I know.

     

    EventSource is kind of dangerous; it keeps calling the script (it auto-reconnects whenever the connection closes).  One time I wasn't paying attention and images kept on getting created... I can only imagine if I had decided to go to bed and not fix that, or at least close the browser - yikes.

     

    I'm going to have to go through my image processing classes and craft some very clever failsafes so EventSource doesn't get hung up.  Maybe I can even time it out on the client side if no progress is being made after a certain period... We'll see.  I've won this battle but there's much to do.

  7. @kicken, so I tried some code with fastcgi_finish_request() and unfortunately I got this:

    Fatal error: Uncaught Error: Call to undefined function fastcgi_finish_request()
    

    So I'm sure it's some Apache mod I'm missing. I looked into it and I think getting that going is above my pay grade... it looks complicated, and the more I read, the more I discovered that there can be issues with logging.  Hmmm.
    It's late, but I think what I might try tomorrow is a 3-pronged approach to keep all 3 phases of the script separate.  Here's what I'm thinking:

     

    1. Upload file and report back progress  (using ajax or this EventSource thing)
    2. Once complete, send a second call to server to start the processing and don't bother listening for a returning msg
    3. Now start polling the server to "listen" in on its progress (the processing will update the DB as it goes; a sketch of this endpoint is below)

    It's what I have in my head anyway... I'll try it tomorrow.
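
    A rough sketch of what step 3's polling endpoint might look like (file, table, and column names are all hypothetical):

    <?php
    // progress.php - polled by the client every second or two.  Returns the
    // worker's progress as JSON; the client stops polling when it's done.
    header('Content-Type: application/json');

    $db   = new mysqli('localhost', 'user', 'pass', 'mydb');
    $hash = $db->real_escape_string($_GET['hash'] ?? '');

    $row = $db->query("SELECT progress, total FROM uploads
                       WHERE hash = '$hash'")->fetch_assoc();

    echo json_encode([
        'progress' => (int)($row['progress'] ?? 0),
        'total'    => (int)($row['total'] ?? 0),
    ]);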

  8. @Jacques1  I set up a test environment and ran your code.  Interesting idea, but here's what happens: it works (kinda) but it throws all the results back at once.

     

    For instance, after loading the page there is no response from the server, and then after 100 seconds it all shows up in the console.  Then after 100 seconds it does the same thing again.  I can confirm this is the output/behavior in both Firefox and Chrome.  Not sure if this is a limitation of my server environment.  I'm running PHP 7 on OS X (my local testing rig).
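
    In case anyone else hits the same "everything arrives at once" behavior: output can be held back at several layers, and each has to be flushed or disabled.  A small test script along these lines helps show where the buffering is (assuming Apache with mod_php or FPM; the exact fix varies by setup):

    <?php
    // streaming-test.php - if the ticks below arrive in one burst instead of
    // one per second, something between PHP and the browser is buffering
    // (output_buffering, zlib.output_compression, mod_deflate, or a proxy).
    header('Content-Type: text/event-stream');
    header('Cache-Control: no-cache');

    // note: output_buffering itself can't be changed at runtime; it has to be
    // off in php.ini or .htaccess.  zlib compression we can turn off here.
    ini_set('zlib.output_compression', '0');

    while (ob_get_level() > 0) {
        ob_end_flush(); // unwind any buffers PHP started automatically
    }

    for ($i = 1; $i <= 5; $i++) {
        echo 'data: tick ' . $i . ' at ' . date('H:i:s') . "\n\n";
        flush();
        sleep(1);
    }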

  9. @kicken,  I think the only part I was missing is the cron job, because what you described is precisely what I built.  Running cron every minute? Would that be intensive on the server, or is this a routine kind of normalcy one can expect?

     

    @Jacques1, server-sent events?  Hmmm, that seems enticing.  But would PHP be able to echo out progress (i.e. JSON) while in the middle of processing?  I thought once PHP is processing, nothing can be echoed out until it's complete?  Please clarify if I'm wrong, because that could be a game-changer indeed.  An exception, of course, would be monitoring the file upload progress.

     

    @Psycho - I incorrectly described the situation, my fault.  The browser isn't locking up, of course, as it's an asynchronous call.  What is happening is that the response hangs until all the processing is completed.  Even if I do this:

    uploadFiles();
    echo 'success, hash=123';
    processImages();

    Even though the echo comes before the processing directive... it never gets sent until the entire script has completed.  So I believe I have to separate the workflow into 2 scripts called separately.

     

  10. I built a "bulk importer" that takes a .zip file filled with images and a corresponding CSV file that holds attributes.

    I'm happily using some JavaScript to provide upload-progress feedback to the user.  So if the .zip file is, say, 10MB... they see its upload progress. (I'm using AJAX.)

    This is all working nicely BUT...

     

    Once the .zip hits the server I need to do A TON of processing.  Each image has to be converted into 10 different sizes, cropped, etc...

    All entries must be entered into the Database and admin text logs created.

    All of this actually works just fine for small files (<10MB), and I'm sure it could work with bigger files by increasing the timeout, etc...

     

    BUT the browser "locks up" during processing and there is no real way to inform the user about the progress of their files being processed.

     

    I thought maybe I could be clever and create a "progress table" in the db... and use it like this:

    1. As soon as the .zip file is uploaded to the server I create a row and an id.
    2. Next I send that id back to the browser (AJAX) and immediately start the laborious processing.  The processing would continually update the DB with its progress.
    3. The js would receive the id and keep polling the DB to check on the processing progress and ultimately report this back to the user.

     

    Well, my brilliant scheme doesn't seem to work and everything locks up regardless.  I think I was trying to fake multi-threading, and I'm not sure how to solve this problem.
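
    One way out of the "fake multi-threading" trap (a sketch; it assumes exec() is allowed and a Linux/macOS host) is to hand the heavy work to a separate PHP process, so the upload request can return immediately while process_images.php - a hypothetical worker script - updates the progress table for the JS to poll:

    <?php
    // Runs right after the .zip is saved and the progress row is created.
    $jobId = 123; // the row id created in step 1

    // Launch the processor as a detached background process; the trailing
    // "&" plus redirected output lets this request return right away.
    $cmd = sprintf(
        'php %s %d > /dev/null 2>&1 &',
        escapeshellarg('/path/to/process_images.php'),
        $jobId
    );
    exec($cmd);

    echo json_encode(['job_id' => $jobId]); // the JS polls with this id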

     

    My end goal is to crunch huge files and keep the user notified of the progress - does anyone have good advice?

  11. I spent the afternoon playing around with Node.js tuts, because Node.js / NPM seems to be a requirement for all these newfangled front-end managers... but I quickly learned that Node.js IS its own server-side language, to be used instead of PHP.  That abruptly ended my "delving".

    @kicken - do you use these tools specifically with PHP? (Are you using bowerPHP?)

  12. Over the past year I started using Composer and have realized that using a dependency manager makes development and code maintenance so much easier.  Yay Composer!

    I can see an equally big need to do this for front-end technologies (i.e. JS & CSS).

     

    What exists for us PHP developers to help maintain that huge mess of front-end stuff we need to include?  Is there something to manage and minify JS/CSS that works well with the PHP environment?

     

    Thanks

  13. I've been using Composer and like the idea of having managed libraries/dependencies in PHP.

    I'm having trouble understanding how to call libraries with autoloading.  Some package authors give great directions and some don't.  Things seem very inconsistent, which is really annoying.

     

    For example I'm using this image library like this:

    use Intervention\Image\ImageManager;
    $imgMan = new ImageManager(array('driver' => 'gd'));
    //etc...

    Awesome! But I can't figure out how to do something similar (namespacing) with firephp and mpdf.

     

    I figured out that these do work:

    $firephp = FirePHP::getInstance(true);
    $mpdf = new mpdf();

    But why all the inconsistencies and different ways of doing things?  I'd like to keep everything neat and namespaced in a similar way.
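
    For comparison, here's how all three styles end up loading through the same vendor/autoload.php - the difference is just that Intervention is a namespaced (PSR-4) package, while the other two register plain global classes (typically via Composer's classmap).  The class_alias() line is only a made-up illustration of one way to tidy the legacy names:

    <?php
    require __DIR__ . '/vendor/autoload.php'; // one autoloader covers all three

    use Intervention\Image\ImageManager; // namespaced (PSR-4) package

    $imgMan  = new ImageManager(array('driver' => 'gd'));
    $firephp = FirePHP::getInstance(true); // legacy: global class, static factory
    $mpdf    = new mpdf();                 // legacy: global class, lowercase name

    // optional: alias a legacy class so calling code at least looks uniform
    class_alias('mpdf', 'Vendor\Mpdf'); // 'Vendor\Mpdf' is a made-up alias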

     

    Can anyone offer some advice?

     

  14. @kicken,

     

    Excellent answer.  The reason I was most annoyed by this: when a user looks at their download history, if they've been using an Android device, everything is doubled!
    I guess the solution is:

    a) screw you, Android users (your log activity is doubled for download requests)

    b) don't log requests made by the same user/link within <3 seconds of one another (sketched below)
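
    A sketch of option (b), assuming a download_log table with user_id, file_hash, and created_at columns (all hypothetical names), and that $userId / $fileHash are already validated:

    <?php
    // Only log the download if this user/file pair wasn't logged within the
    // last 3 seconds; absorbs the duplicate Android Chrome request.
    $sql = "INSERT INTO download_log (user_id, file_hash, created_at)
            SELECT {$userId}, '{$fileHash}', NOW()
            FROM dual
            WHERE NOT EXISTS (
                SELECT 1 FROM download_log
                WHERE user_id = {$userId}
                  AND file_hash = '{$fileHash}'
                  AND created_at > NOW() - INTERVAL 3 SECOND
            )";
    $this->mysqli->query($sql);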

     

    Thanks again for your answer.

  15. I have a PHP application that serves PDF downloads.  It works fine on all devices and browsers, with one small but really annoying side effect (edge case for sure).

    When I look at my download logs, anytime the download is triggered from Chrome on Android it is called twice!  Bizarre behavior, and I can't figure it out.

     

    Some background:

     

    The download is a PDF that gets created on the fly.

    All requests get processed through my index.php controller.

     

    I was serving the request with javascript via:

    window.open('export?file=something_to_inform_the_controller');

    Works great in all browsers and devices, but Android Chrome triggers this twice.

     

    So I got wise and thought maybe a direct link would work better:

    <a target="_blank" href="http://mysite.com/export?file=abc123" download="file.pdf">DL link</a>
    or
    <a target="_blank" href="http://mysite.com/export?file=abc123">DL link</a>
    or
    <a target="_self" href="http://mysite.com/export?file=abc123" download="file.pdf">DL link</a>
    or
    <a target="_self" href="http://mysite.com/export?file=abc123">DL link</a>

    Nope, none of these flavors prevents android/chrome from double downloading.

     

    Then I researched my php header settings and tried:

    Content-Disposition: inline  vs.

    Content-Disposition: attachment, with no success.
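
    For reference, a sketch of the full header set for the attachment flavor (the filename and the mpdf-style Output() call are illustrative):

    <?php
    // export controller (sketch) - serve the generated PDF as a download
    $pdf = $mpdf->Output('', 'S'); // mpdf-style: 'S' returns the PDF as a string

    header('Content-Type: application/pdf');
    header('Content-Disposition: attachment; filename="file.pdf"');
    header('Content-Length: ' . strlen($pdf));

    echo $pdf;
    exit;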

     

    Note: the download is logged when the controller processes the request.  I see duplicated download events for all downloads on Android/Chrome.  It's strange that I haven't found a solution online for this; maybe I'm overlooking something silly.

     

    Any ideas?

  16. @kicken,  thanks for your insightful response.  I can tell you've been down this path before.

    It's common to combine what would be Staging and Development. 

     

    The only issue is that I develop on a Mac, and naturally the production environment would be Linux, so the staging and production environments would be drastically different.

     

    I was thinking that staging would be a subdomain on the production server with access limited by IPs.  Do you see any problems with this?

     

    Either way, it's time I get my hands dirty with some more Linux / command-line stuff.  I think I might go get some cloud hosting for almost free and practice (there are some free/almost-free options now), and it seems like a pure Linux environment with no restrictions.

  17. I've spent the last year building a web application on my local machines using the typical LAMP stack.  I've been a developer for 10+ years and am fairly good when it comes to scripting, but the server/hosting/sysadmin thing scares me.  I've taken tons of sites live, but they always exist on shared hosting and require minimal maintenance... simply FTP changes... no big deal.

     

    With my latest personal projects I've used revision control (git or Mercurial) simply as a way to let me work from different machines.  It's awesome.  I push code from home, work, and my laptop, and everything is in sync.  It really has changed the game for me. (I use Bitbucket.)

     

    My latest project will involve paying customers and has a huge code base.  FTPing files is not going to cut it.  I've heard of having a "staging" environment so that you can push code to staging, test it, and then push to production.  That sounds perfect!

     

    Every time I google git/staging I get pages and pages of command-line stuff.  I'm used to using version control GUIs like Tower and SourceTree.

     

    Are there server environments that would allow me to use a GUI to manage version control?  Or are Linux server environments command-line only?

     

    Are there any hosting companies you know of that would be a good fit for these needs?  I'm looking to keep the hosting <= $20/month.

     

     

    Thanks

  18. @Barand

    Funny, I didn't even know this was a "model"... I was just doing what made sense to me :)  But boy, what a nightmare to filter through stuff.

    I figured out I can sort like this:

    ORDER BY FIELD(attribute.name, 'Year') DESC, value ASC

    But I can't limit the query in a useful way for pagination, and I can't limit the scope of the search as in WHERE 'year' = 2014... because the year only exists as one attribute of many.
    It's such a headache that I'm thinking of a redesign.  Maybe have two tables:

    table: item
    fields: item_id, attr1, attr2, attr3 ...
    
    table: item_meta
    fields: item_meta_id, item_id, data_type, name

    This way I can perform normal operations on a single table and refer to the meta table for the user-generated names, datatypes, etc.

    I would just have to limit the number of attributes a user could define, so table item would have up to, say, attr50 (50 fields).

     

    I wonder if this idea is some type of "model" too?

  19. @Barand, sure, I can do that with regular arrangements when I'm sorting on a column/field, but with my db the way it is now, there is never a field to sort on.  For example, look at my sample array.  If I wanted to sort based on "Year", there is no field labeled "Year" in the database.  Instead, years are stored in the table item_attribute under field name="Year" and field value="1985".

    So a row would look like:

    item_attribute_id | item_id | name | value
    12                | 3       | Year | 1985

    Upon further investigation, maybe I can use the ORDER BY FIELD() clause/function?

     

    And how can I retrieve a specific number of rows for pagination if I'm using many rows to create "one item"?

     

    Any examples/suggestions would be helpful.

     

    Thanks.

  20. I’m writing a small application that allows me/users to add their own inputs so they can store their own data.  It’s kind of like a user-defined database.  Think of a super simple “FileMaker” kind of application.  The way I’ve set this up is like this:

    Table: item
    fields: item_id, user_id
    
    Table: item_attribute
    fields: item_attribute_id, item_id, name, value

    After my query I end up with stuff like this:

    $data = array(
                0=>array(
                            '123'=>array(
                                        'name'=>'Year',
                                        'value'=>'1985'
                            ),
                            '7'=>array(
                                        'name'=>'Title',
                                        'value'=>'Title For 1985'
                            ),
                            '25'=>array(
                                        'name'=>'Length',
                                        'value'=>'60'
                            )
                ),
                1=>array(
                            '123'=>array(
                                        'name'=>'Year',
                                        'value'=>'1990'
                            ),
                            '7'=>array(
                                        'name'=>'Title',
                                        'value'=>'Title For 1990'
                            ),
                            '25'=>array(
                                        'name'=>'Length',
                                        'value'=>'44'
                            )
                ),
                2=>array(
                            '123'=>array(
                                        'name'=>'Year',
                                        'value'=>'1965'
                            ),
                            '7'=>array(
                                        'name'=>'Title',
                                        'value'=>'Title For 1965'
                            ),
                            '25'=>array(
                                        'name'=>'Length',
                                        'value'=>'122'
                            )
                )
    );
    

    This seems to work great as there is no limit to what a user can add - it's working like I would like it to...

     

    But now that I’m trying to add sorting/filtering/pagination to the results, things have gotten extremely difficult on the back end.

     

    The only thing I can think of is to pull all the data and then use PHP algorithms to sort/filter/paginate before sending it over to the web page.  I can see this getting slow if I eventually have to pull 1000s of records.
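
    One possible approach, while keeping the current EAV tables: pivot the attribute rows into columns with conditional aggregation, so sorting, filtering, and LIMIT all happen in MySQL as if it were a normal table.  A sketch assuming an existing $mysqli connection, with the attribute names hardcoded for illustration (in practice they'd be whitelisted from the user's defined attributes):

    <?php
    // One row per item, with each named attribute pivoted into its own column.
    $sql = "SELECT ia.item_id,
                   MAX(CASE WHEN ia.name = 'Year'   THEN ia.value END) AS `year`,
                   MAX(CASE WHEN ia.name = 'Title'  THEN ia.value END) AS `title`,
                   MAX(CASE WHEN ia.name = 'Length' THEN ia.value END) AS `length`
            FROM item_attribute ia
            GROUP BY ia.item_id
            HAVING `year` = '2014'            -- filter (HAVING, since year is an alias)
            ORDER BY `year` DESC, `title` ASC -- sort
            LIMIT 20 OFFSET 0";
    $result = $mysqli->query($sql); // 20 items per page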

     

     

    Does anyone have advice in these matters? 

    Should I rethink my db design? 

    Can the results be used to create a temporary table and then sort on that? (Don't know how, or if this is even possible.)

    I need some help

     

    Thanks.

  21. Curiously, the problem solved itself after a reboot.  I'm running XAMPP on Windows 7 on a laptop, and I get a lot of weird inconsistencies like this when I change table structures and then run queries with PHP.  I find I have to restart MySQL after adding table columns or they don't get recognized from PHP, but they do with phpMyAdmin.  It's strange, as I never encounter these problems when using MySQL/PHP on my Mac or production Linux server.  I'll just assume it's some low-level installation glitch within XAMPP that I'll probably never get to the bottom of.

  22. I can't seem to get reliable results after deleting row(s) using mysqli.  The affected_rows is always 0.

    I'm using PHP 5.2 and the mysqli extension.

    Something like this:

    $sql = "DELETE FROM classRoster WHERE id=123";
    if ($result = $this->mysqli->query($sql)) {
        return $this->mysqli->affected_rows;   // ALWAYS RETURNS 0
    } else {
        // log error
    }

    I have confirmed that the row is in fact deleted, but I still get 0 as affected_rows.  Please help!
