Everything posted by fivestringsurf

  1. Ever have one of those problems that makes no sense? Yep, that's me for the past 2 days... I've had an Apache/PHP stack set up on my OSX machine for at least 6 years with little to no problems. I had all of my v-hosts for local sites under /etc/apache2/other (they were included via httpd.conf). All of a sudden my local sites stopped working, and after tracking down the problem by using: >sudo apachectl -t I found out that my v-hosts were no longer there... heck, the whole directory /etc/apache2/other was missing. Now I'm quite sure I did not touch this... the question remains: what the hell happened? I'm not foolish enough to think someone is going to be able to tell me precisely what happened, but some insight would be nice: Has this happened to any of you before? Did OSX auto-update something and wipe things out? I mean... I didn't update to a major release... still on El Capitan. Was it an automatic Xcode update? I'm very baffled right now.
  2. I have a DigitalOcean account and would like to be able to auto-deploy code to the DigitalOcean server after a push from my local machine. I've been using Beanstalkapp to do this with one of my projects. I love Beanstalkapp... it's easy to use and well documented. Not being a sys-admin, this is important to me. The problem is they only allow 1 free repo, and I need another for this charitable app I'm making for my school. Is there any other free repo solution I can use to push code to a staging/production server (at DigitalOcean)? Please keep in mind I'm not a sys-admin, so it has to be simple like Beanstalkapp. Thanks!
  3. Good advice @kicken... sounds like a wise implementation!
  4. So I did get it working, thanks to all the helpful minds here in this forum. @Jacques1 ob_flush() was key! It was really difficult to wrap my mind around the solution because EventSource wasn't as easy to work with as ajax. EventSource expects a very specific (sort of bizarre) returning structure, and if even one line ending is off it doesn't work. I also couldn't grasp how to upload the file and then listen for it, because you can't send files with EventSource, so I couldn't get EventSource to listen in on the file upload progress. But that wasn't the biggest deal... I just used my normal ajax-style upload function with the XMLHttpRequest progress handler to do the work. Here's what I did:
     1. Upload the file to import.php using ajax and display the progress (pretty straightforward stuff)
     2. As soon as the file is done uploading to import.php, I log it to the database and generate a hash
     3. I send the hash back with the returning JSON to the ajax script that started it all
     4. I immediately call EventSource to start listening in on a separate script that lives at import_progress.php (I used ?url_hash=123abc in the URL to pass the hash). I don't think EventSource is meant to pass vars... I was trying to be clever
     5. import_progress.php checks the db based on the hash and starts processing. Each time the processing gets through a loop, it increments an (int) progress field in the database and immediately echoes the progress out, followed by ob_flush(); flush();
     6. Meanwhile, back on the client side, we're listening to the echoes and manipulating a progress bar
     Maybe it's just me, but I really felt like I stretched the technologies, PHP in particular, to the limit here in forcing it to behave in a way it was never designed to. Passing the $_GET variable in step 4 felt a bit janky, but I didn't know any other way to do it. Once EventSource is called it has no knowledge of what has been uploaded, so this was the only way I found to do it, and it can't monitor the ajax upload as far as I know. EventSource is kind of dangerous; it keeps calling the script. One time I wasn't paying attention and images kept on getting created... I can only imagine if I had decided to go to bed and not fix that, or at least close the browser - yikes. I'm going to have to go through my image processing classes and craft some very clever fail-safes so EventSource doesn't get hung up. Maybe I can even time it out on the client side if no progress is being made after a certain time period... We'll see. I've won this battle, but there's much to do.
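The import_progress.php side of the steps above can be sketched roughly as follows. This is a minimal illustration, not the poster's actual code: the database lookup is simulated with a closure, and all names (sseMessage, $fetchProgress) are made up for the example.

```php
<?php
// Minimal sketch of an SSE progress endpoint like import_progress.php.
// The DB progress check is simulated; in the real script it would
// SELECT the (int) progress field by the url_hash passed in the URL.

function sseMessage(int $progress): string
{
    // The SSE wire format the post mentions: a "data:" line
    // terminated by a blank line (two \n characters).
    return "data: {$progress}\n\n";
}

// In the real endpoint these headers are required:
// header('Content-Type: text/event-stream');
// header('Cache-Control: no-cache');

$hash = $_GET['url_hash'] ?? 'demo';

// Stand-in for the database progress check (illustrative only).
$fetchProgress = function (string $hash): int {
    static $pct = 0;
    return $pct = min(100, $pct + 25);
};

while (true) {
    $pct = $fetchProgress($hash);
    echo sseMessage($pct);
    @ob_flush();  // push each message out immediately
    flush();
    if ($pct >= 100) {
        break;    // stop so the client can close the EventSource
    }
}
```

Ending the loop at 100% matters for exactly the runaway-script danger described above: if the server never stops emitting, the browser's EventSource will keep the connection (and the work) going.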
  5. @kicken so I tried some code with fastcgi_finish_request() and unfortunately I got this: Fatal error: Uncaught Error: Call to undefined function fastcgi_finish_request() So I'm sure it's some Apache mod I'm missing. I looked into it, and I think getting that going is above my pay grade... it looks complicated, and the more I read, the more I discovered that there can be issues with logging. Hmmm. It's late, but I think what I might try tomorrow is a 3-prong approach to keep all 3 phases of the script separate. Here's what I'm thinking:
     1. Upload the file and report back progress (using ajax or this EventSource thing)
     2. Once complete, send a second call to the server to start the processing, and don't bother listening for a returning msg
     3. Now start polling the server to "listen" on its progress (the processing will update the DB on progress)
     It's what I have in my head anyway... I'll try it tomorrow.
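For reference, the fatal error above can be avoided with a guard, since fastcgi_finish_request() only exists when PHP runs under FPM (not under mod_php or the CLI). A hedged sketch of the "respond first, keep working" idea, with an illustrative helper name:

```php
<?php
// finishResponseEarly() is a made-up helper name for this sketch.
// fastcgi_finish_request() is only defined under PHP-FPM, so a
// function_exists() guard avoids the fatal error quoted in the post.

function finishResponseEarly(): bool
{
    if (function_exists('fastcgi_finish_request')) {
        fastcgi_finish_request();  // client receives the response now
        return true;
    }
    // Fallback: flush what we can; the connection stays open, so the
    // client still waits for the script to finish.
    @ob_flush();
    flush();
    return false;
}

echo "upload accepted\n";
$usedFpm = finishResponseEarly();
// ...long-running image processing would continue here either way...
```

Under Apache with mod_php this returns false, which is consistent with the undefined-function error the poster hit: the feature simply isn't available in that SAPI.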
  6. @Jacques1 I set up a test environment and ran your code. Interesting idea, but here's what happens: it works (kinda) but it throws all the results back at once. For instance, after loading the page there is no response from the server, and then after 100 seconds it all shows up in the console. Then after another 100 seconds it does the same thing again. I can confirm that this is the output/behavior in both FF and Chrome. Not sure if this is a limitation of my server environment. I'm running PHP 7 on OSX (my local testing rig).
  7. @kicken, I think the only part I was missing is the cron job, because what you described is precisely what I built. Running cron every minute? Would that be intensive on the server, or is this a routine kind of normalcy one can expect? @Jacques1, server-sent events? Hmmm, that seems enticing. But would PHP be able to echo out progress (ie: JSON) while in the middle of processing? I thought once PHP is processing, nothing can be echoed out until it's complete? Please clarify if I'm wrong, because that could be a game-changer indeed. An exception of course would be monitoring the file upload progress. @Psycho - I incorrectly described the situation, my fault. The browser isn't locking up, of course, as it's an asynchronous call. What is happening is that the return response hangs until all the processing is completed. Even if I do this: $uploadfiles(); echo 'success, hash=123'; $processImages(); Even though the echo is before the processing directive... it never gets sent until the entire script is completed. So I believe I have to separate the workflow into 2 scripts called separately.
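A small, self-contained demonstration of the behavior described above: PHP buffers output, so an echo placed before heavy processing isn't sent to the browser on its own. (This sketch only inspects the buffer; the flush calls in the comment are what would actually push bytes out mid-script.)

```php
<?php
// Demonstrates why the echo before $processImages() never reaches
// the browser early: the output sits in PHP's buffer until it is
// flushed explicitly or the script ends.

ob_start();
echo 'success, hash=123';
$buffered = ob_get_contents();  // text is still in the buffer, unsent
ob_end_clean();                 // discard it for this demo

// To push it out mid-script you would need something like:
//   echo 'success, hash=123'; ob_flush(); flush();
// and even then the HTTP request stays open until the script finishes,
// which is why splitting the workflow into 2 scripts is the cleaner route.
```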
  8. I built a "bulk importer" that takes a .zip file filled with images and a corresponding csv file that holds attributes. I'm happily using some JavaScript to provide upload-progress feedback to the user. So if the .zip file is, say, 10mb... they are seeing its upload progress. (I'm using AJAX.) This is all working nicely BUT... once the .zip hits the server I need to do A TON of processing. Each image has to be converted into 10 different sizes, cropped, etc... All entries must be entered into the database and admin text logs created. All of this actually works just fine for small files <10mb, and I'm sure it could work with bigger files by increasing the timeout, etc... BUT the browser "locks up" during processing, and there is no real way to inform the user about the progress of their files being processed. I thought maybe I could be clever and create a "progress table" in the db... and use it like this:
     1. As soon as the .zip file is uploaded to the server, I create a row and an id.
     2. Next I send that id back to the browser (AJAX) and immediately start the laborious processing.
     3. The processing continually updates the DB with its progress.
     4. The js receives the id and keeps polling the DB to check on the processing progress, ultimately reporting this back to the user.
     Well, my brilliant scheme doesn't seem to work, and everything locks up regardless. I think I was trying to fake multi-threading, and I'm not sure how to solve this problem. My end goal is to crunch huge files and keep the user notified of the progress - does anyone have good advice?
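The "progress table" idea described above can be sketched as below. This is an illustration only: an in-memory SQLite database stands in for the real MySQL table, and the table/column names (import_progress, progress) are invented for the example.

```php
<?php
// Sketch of the progress-table scheme: one row per import job,
// updated by the worker and polled by the front end.
// In-memory SQLite is a stand-in for the real database.

$db = new PDO('sqlite::memory:');
$db->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
$db->exec('CREATE TABLE import_progress (id INTEGER PRIMARY KEY, progress INTEGER NOT NULL)');

// 1. Row created as soon as the .zip arrives; its id goes back to the browser.
$db->exec('INSERT INTO import_progress (progress) VALUES (0)');
$jobId = (int) $db->lastInsertId();

// 2. The processing loop bumps the counter as each batch of images finishes.
$update = $db->prepare('UPDATE import_progress SET progress = ? WHERE id = ?');
foreach ([25, 50, 75, 100] as $pct) {
    $update->execute([$pct, $jobId]);
}

// 3. The polling endpoint just reads the current value for that id.
$read = $db->prepare('SELECT progress FROM import_progress WHERE id = ?');
$read->execute([$jobId]);
$current = (int) $read->fetchColumn();
```

Note that steps 2 and 3 must run in separate requests (or a cron/background worker) for the polling to see anything useful; when both happen in the same request, the lock-up described in the post is exactly what you get.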
  9. I spent the afternoon playing around with node.js tuts, because node.js / NPM seems to be a requirement for all these newfangled front-end managers... but I quickly learned that node.js IS its own server-side language, to be used instead of PHP. That abruptly ended my "delving". @kicken - do you use these tools specifically with PHP? (Are you using bowerPHP?)
  10. Over the past year I started using composer and have realized that using a dependency manager makes development and code maintenance so much easier. Yay composer! I can see an equally big need to do this for front-end technologies (ie: js & css). What exists for us PHP developers to help maintain that huge mess of front-end stuff we need to include? Is there something to manage and minify JS/CSS that works well with the PHP environment? Thanks
  11. I've been using composer and like the idea of having managed libraries/dependencies in PHP. I'm having trouble understanding how to call libraries with autoloading. Some of the package authors give great directions and some don't. Things seem very inconsistent, which is really annoying. For example, I'm using this image library like this: use Intervention\Image\ImageManager; $imgMan = new ImageManager(array('driver' => 'gd')); //etc... Awesome! But I can't figure out how to do something similar (namespacing) with FirePHP and mpdf. I figured out that these do work: $firephp = FirePHP::getInstance(true); $mpdf = new mpdf(); But why all the inconsistencies and different ways of doing things? I'd like to keep everything neat and namespaced in a similar way. Can anyone offer some advice?
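The inconsistency comes down to what each package declares: Composer's autoloader just maps whatever the package's composer.json says. Packages with PSR-4 namespaces (like Intervention\Image) need a `use` statement; older packages (FirePHP and mpdf at the time) register plain global classes, so they are instantiated without one. A minimal sketch of the PSR-4 prefix-to-path mapping Composer generates; the function name and vendor path here are illustrative:

```php
<?php
// psr4Path() mimics what Composer's generated PSR-4 autoloader does:
// match a namespace prefix, then turn the rest of the class name into
// a file path. Names and paths are illustrative only.

function psr4Path(string $class, array $prefixes): ?string
{
    foreach ($prefixes as $prefix => $baseDir) {
        if (strncmp($class, $prefix, strlen($prefix)) === 0) {
            $relative = substr($class, strlen($prefix));
            return $baseDir . str_replace('\\', '/', $relative) . '.php';
        }
    }
    return null;  // not PSR-4 mapped, e.g. a global-namespace legacy class
}

$map = ['Intervention\\Image\\' => 'vendor/intervention/image/src/Intervention/Image/'];

// Namespaced class resolves to a file under the mapped directory.
$path = psr4Path('Intervention\\Image\\ImageManager', $map);

// Legacy global classes like FirePHP don't match any prefix; Composer
// loads those via a classmap instead, which is why no `use` is needed.
$legacy = psr4Path('FirePHP', $map);
```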
  12. @kicken, excellent answer. The reason I was most annoyed by this was that when a user looks at their download history, if they have been using an Android device, everything is doubled! I guess the solution is: a) screw you, Android users (your log activity is doubled for download requests), or b) don't log requests made by the same user/link within <3 seconds of one another. Thanks again for your answer.
  13. I have a php application that serves pdf downloads. It works fine on all devices and browsers, with one small but really annoying side-effect (edge-case for sure): when I look at my download logs, any time the download is triggered from Chrome on Android it is called twice! Bizarre behavior, and I can't figure it out. Some background: the download is a pdf that gets created on the fly. All requests get processed through my index.php controller. I was serving the request with javascript via: window.open('export?file=something_to_inform_the_controller'); Works great in all browsers and devices, but Android Chrome triggers this twice. So I got wise and thought maybe a direct link would work better: <a target="_blank" href="http://mysite.com/export?file=abc123" download="file.pdf">DL link</a> or <a target="_blank" href="http://mysite.com/export?file=abc123">DL link</a> or <a target="_self" href="http://mysite.com/export?file=abc123" download="file.pdf">DL link</a> or <a target="_self" href="http://mysite.com/export?file=abc123">DL link</a> Nope, none of these flavors prevents Android/Chrome from double downloading. Then I researched my php header settings and tried: content-disposition: inline vs. content-disposition: attachment with no success. Note, the download is logged when the controller processes the request for the download. I have duplicated download events for all downloads on Android/Chrome. It's strange that I have not found a solution online for this, or maybe I'm overlooking something silly. Any ideas?
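The header experiments described above can be collected into one testable helper. This is a sketch, not the poster's controller: the function name and header set are illustrative, and building the headers as an array (rather than calling header() directly) just makes them easy to inspect.

```php
<?php
// buildDownloadHeaders() is an illustrative helper assembling the
// header set the post experiments with (disposition, type, length).

function buildDownloadHeaders(string $filename, int $size, bool $inline = false): array
{
    $disposition = $inline ? 'inline' : 'attachment';
    return [
        'Content-Type'        => 'application/pdf',
        'Content-Disposition' => $disposition . '; filename="' . $filename . '"',
        'Content-Length'      => (string) $size,
        'Cache-Control'       => 'private, max-age=0',
    ];
}

$headers = buildDownloadHeaders('file.pdf', 1024);
// In the controller: foreach ($headers as $k => $v) { header("$k: $v"); }
```

One hedged observation: since the PDF is generated on the fly, the response may be missing a Content-Length, and download managers on some Android builds are known to re-request files in that situation; sending an accurate Content-Length is worth trying before resorting to the dedupe-by-timestamp logging workaround.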
  14. @kicken, thanks for your insightful response. I can tell you've been down this path before. The only issue is that I develop on a Mac, and naturally the production environment would be Linux, so the staging and production environments would be drastically different. I was thinking that staging would be a subdomain on the production server with access limited by IPs. Do you see any problems with this? Either way, it's time I got my hands dirty with some more Linux / command line stuff. I think I might go get some cloud hosting for almost free and practice. (There are some free/almost-free options now.) And it seems like a pure Linux environment with no restrictions.
  15. I've spent the last year building a web application on my local machines using the typical LAMP stack. I've been a developer for 10+ years and am fairly good when it comes to scripting, but the server/hosting/system admin thing scares me. I've taken tons of sites live, but they always exist on shared hosting and require minimal maintenance... simply ftp changes... no big deal. With my latest personal projects I've used revision control (git or mercurial) simply as a way to let me work from different machines. It's awesome. I push code from home, work, and my laptop, and everything is in sync. It really has changed the game for me. (I use bitbucket.) My latest project will involve paying customers and has a huge code base. FTPing files is not going to cut it. I've heard of having a "staging" environment so that you can push code to staging, test it, and then push to production. That sounds perfect! Every time I google git/staging I get pages and pages of command line stuff. I'm used to using version control GUIs like Tower and SourceTree. Are there server environments that would allow me to use a GUI to manage version control? Or are Linux server environments command line only? Are there any hosting companies you know of that would be a good fit for these needs? I'm looking to keep the hosting <= $20/month. Thanks
  16. @Barand Funny, I didn't even know this was a "model"... I was just doing what made sense to me. But boy, what a nightmare to filter through stuff. I figured out I can sort like this: ORDER BY FIELD( attribute.name, 'Year' ) DESC, value ASC But I can't limit the query in a useful way for pagination, and I can't limit the scope of the search, as in WHERE 'year' = 2014... because the year only exists as one attribute of many. It's such a headache that I'm thinking of a redesign, maybe with two tables: table: item fields: item_id, attr1, attr2, attr3 ... table: item_meta fields: item_meta_id, item_id, data_type, name This way I can perform normal operations on a single table and refer to the meta table for the user-generated names, datatypes, etc. I would just have to limit the number of attributes a user could define, so table: item would have up to, say, attr50 (50 fields). I wonder if this idea is some type of "model" too?
  17. @Barand, sure, I can do that with regular arrangements when I'm sorting on a column/field, but with my db the way it is now, there is never a field to sort on. For example, look at my sample array. If I wanted to sort based on "Year", there is no field labeled "Year" in the database. Instead, years are stored in the table item_attribute under field name="Year" and field value="1985". So a row would look like:
     item_attribute_id | item_id | name | value
     12                | 3       | Year | 1985
     Upon further investigation, maybe I can use the ORDER BY FIELD() clause/function? And how can I retrieve a specific number of rows for pagination if I'm using many rows to create "one item"? Any examples/suggestions would be helpful. Thanks.
  18. I’m writing a small application that allows me/users to add their own inputs so they can store their own data. It’s kind of like a user-defined database. Think of a super simple “filemaker” kind of application. The way I’ve set this up is like this: Table: item Fields: item_id, user_id Table: item_attribute Fields: item_attribute_id, item_id, name, value After my query I end up with stuff like this:
     $data = array(
         0 => array(
             '123' => array('name' => 'Year',   'value' => '1985'),
             '7'   => array('name' => 'Title',  'value' => 'Title For 1985'),
             '25'  => array('name' => 'Length', 'value' => '60')
         ),
         1 => array(
             '123' => array('name' => 'Year',   'value' => '1990'),
             '7'   => array('name' => 'Title',  'value' => 'Title For 1990'),
             '25'  => array('name' => 'Length', 'value' => '44')
         ),
         2 => array(
             '123' => array('name' => 'Year',   'value' => '1965'),
             '7'   => array('name' => 'Title',  'value' => 'Title For 1965'),
             '25'  => array('name' => 'Length', 'value' => '122')
         )
     );
     This seems to work great, as there is no limit to what a user can add - it's working like I would like it to... But now that I’m trying to add sorting/filtering/pagination to the results, things have gotten extremely difficult on the back end. The only thing I can think of is to pull all the data and then use PHP algorithms to sort/filter/paginate before sending it over to the web page. I can see this getting slow if I eventually have to pull 1000s of records. Does anyone have advice in these matters? Should I rethink my db design? Can the results be used to create a temporary table and then sort on it? (I don't know how, or if this is even possible.) I need some help. Thanks.
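The sorting/pagination problem for the two-table schema above can be handled in SQL rather than in PHP: join the one attribute you want to sort on back to the item list, order by its value, and LIMIT/OFFSET the items (not the attribute rows), then fetch full attribute sets for just that page. A sketch under stated assumptions: in-memory SQLite stands in for the real database, and the sample data mirrors the array in the post.

```php
<?php
// EAV sort + paginate sketch: page through item_ids ordered by the
// "Year" attribute. In-memory SQLite is a stand-in for MySQL.

$db = new PDO('sqlite::memory:');
$db->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
$db->exec('CREATE TABLE item (item_id INTEGER PRIMARY KEY, user_id INTEGER)');
$db->exec('CREATE TABLE item_attribute (item_attribute_id INTEGER PRIMARY KEY,
           item_id INTEGER, name TEXT, value TEXT)');

// Seed with the years from the sample array (1985, 1990, 1965).
$ins = $db->prepare('INSERT INTO item_attribute (item_id, name, value) VALUES (?, ?, ?)');
foreach ([[1, 'Year', '1985'], [2, 'Year', '1990'], [3, 'Year', '1965']] as [$id, $name, $value]) {
    $db->exec("INSERT INTO item (item_id, user_id) VALUES ($id, 1)");
    $ins->execute([$id, $name, $value]);
}

// One page of item_ids, sorted by the Year attribute. A second query
// would then fetch all attributes for just these ids.
$stmt = $db->query(
    "SELECT i.item_id
       FROM item i
       JOIN item_attribute a ON a.item_id = i.item_id AND a.name = 'Year'
      ORDER BY CAST(a.value AS INTEGER) ASC
      LIMIT 2 OFFSET 0"
);
$page = $stmt->fetchAll(PDO::FETCH_COLUMN);
// $page holds the first page of item_ids, 1965's item first
```

The key design point is paginating over items while sorting on a joined attribute row, which avoids both the "many rows per item" LIMIT problem and pulling everything into PHP.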