
Darkstar

Members
  • Posts

    20
  • Joined

  • Last visited

    Never

Profile Information

  • Gender
    Not Telling


  1. I have looked at phpThumb and it does, in fact, look awesome. Well, folks, I'm very glad you all helped out so quickly, but as it turns out it was just plain stupidity on my part. There was a rogue block of code (I forgot the else around it because it's only two lines long) that was outputting the full image after the resized one. The browser would show me the resized image but still load the full image in the background. My function is very similar to Garethp's, except that to save computing power I put an if block around the actual resizing portion: if the desired size is greater than or equal to the current image size, simply output the image instead of bothering with GD. The function was only meant for downsizing images, so I didn't have to worry about enlarging them. Like I said, I forgot the else, so it was ALWAYS outputting the full-sized image but only showing the resized one.
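For anyone hitting the same thing, here is a minimal sketch of the fixed flow. The function and variable names are mine, not the original code; the point is that the passthrough and the GD path must sit in an if/else so only one branch can emit image data.

```php
<?php
// Compute target dimensions preserving aspect ratio; returns the
// original size when no downsizing is needed.
function fitWithin(int $w, int $h, int $maxW, int $maxH): array
{
    if ($maxW >= $w && $maxH >= $h) {
        return [$w, $h];
    }
    $ratio = min($maxW / $w, $maxH / $h);
    return [(int) round($w * $ratio), (int) round($h * $ratio)];
}

function outputResized(string $path, int $maxW, int $maxH): void
{
    [$w, $h] = getimagesize($path);
    [$newW, $newH] = fitWithin($w, $h, $maxW, $maxH);

    header('Content-Type: image/jpeg');
    if ($newW === $w && $newH === $h) {
        readfile($path);          // no resize needed: skip GD entirely
    } else {
        // Without this else, execution fell through and the full-size
        // image was emitted right after the resized one.
        $src = imagecreatefromjpeg($path);
        $dst = imagecreatetruecolor($newW, $newH);
        imagecopyresampled($dst, $src, 0, 0, 0, 0, $newW, $newH, $w, $h);
        imagejpeg($dst);
        imagedestroy($src);
        imagedestroy($dst);
    }
}
```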
  2. I'm currently playing with the idea of resizing images on the fly for certain applications. My original thought was that if I pass a smaller image to the browser (or Flash, or anything else), it should load faster because the file size should theoretically be smaller too. I have a "full quality" 2000x2000px image, and that's my base for resizing. Somehow this 1.56MB file ends up around 1.8MB when sized down to 1800x1800, and at a mere 300x300 it's still 1.6MB. Has anybody else noticed this? I tried setting the quality on imagejpeg(), and that gave me a minimal change in size even when set to about 50 (even though it looked horrible).
  3. Basically, what I want to avoid is what you hit on: an obtrusive watermark that takes away from the experience of viewing the images. What I also don't want is to open Photoshop for every image and watermark it manually, because that would take a considerable amount of time. So let me run this past you: I was originally going to use a script to add all new photos to the database and then prompt for the categories each one shows up in. I figure I can make the watermark part of that process and have it prompt me for coordinates to place it so it looks unobtrusive.
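One way to make the prompt-for-coordinates step less fiddly is to prompt for a corner keyword instead of raw pixels and compute the (x, y) from it. A sketch with made-up names (imagecopymerge is the GD call that blends the stamp; the 40% opacity is just an illustrative default):

```php
<?php
// Translate a corner keyword + margin into the (x, y) destination
// for GD's imagecopymerge(). Corner names are illustrative.
function watermarkOrigin(int $imgW, int $imgH, int $wmW, int $wmH,
                         string $corner, int $margin = 10): array
{
    switch ($corner) {
        case 'top-left':     return [$margin, $margin];
        case 'top-right':    return [$imgW - $wmW - $margin, $margin];
        case 'bottom-left':  return [$margin, $imgH - $wmH - $margin];
        default:             return [$imgW - $wmW - $margin,
                                     $imgH - $wmH - $margin]; // bottom-right
    }
}

function applyWatermark($img, $stamp, string $corner): void
{
    [$x, $y] = watermarkOrigin(imagesx($img), imagesy($img),
                               imagesx($stamp), imagesy($stamp), $corner);
    // 40% opacity keeps the mark readable without ruining the photo.
    imagecopymerge($img, $stamp, $x, $y, 0, 0,
                   imagesx($stamp), imagesy($stamp), 40);
}
```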
  4. That wouldn't solve it, unfortunately. The number would be appended, but if you copy the URL and type it into the address bar in IE, it'll still be cached, because the randomizer won't change between the img src and the time the image is loaded on its own. That would work if I needed it displayed differently on two pages, but not if it's being requested by itself.
  5. Although true, most people wouldn't go through the trouble. You're right, though: I should watermark them regardless. I just wanted it to be at least a little friendly when you're looking at the pictures on the site; I hate going to sites and seeing watermarks on images that are simply being displayed in the page. Maybe the solution I'm looking for is smaller thumbnails that link to a watermarked, larger image.
  6. The while sets $row, so you don't need the commented line above it. As for it still showing blank: try print_r($row) inside the while instead of the echo to see what it actually stored, and work from there.
  7. Let me start at the beginning. Somebody wanted me to code their website; it's a photography site, so I wanted to disallow image downloads for obvious reasons. I was going to use a few tricks to accomplish this. I would overlay a transparent image over the actual image so that if somebody right-clicks and hits Save As, they don't get the real image. If they're smart enough to view source, they'd see the following: <img src="nodownload.php?image=image.jpg">. nodownload.php checks whether it's being called from a page on a certain domain; if it is, it uses GD to grab the image and display it. If the page is being hotlinked, or there's no HTTP_REFERER, it watermarks the picture with GD before outputting it. This works fine in Firefox, because Firefox doesn't cache the output of the PHP page, so if they type the URL of the image directly it runs through the script again. IE, on the other hand, caches the output and displays the picture it showed when called from the <img> tag, which defeats the purpose of having the code at all. I need a way to force all browsers to run the script (not the containing page) instead of caching it. Thoughts? If not, is there a better way of doing this other than mod_rewrite? If it comes down to it, I guess I could simply use mod_rewrite to redirect to an error page.
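The caching half of this is fixable with response headers. A sketch of what a script like nodownload.php could send before any image data — the referer pattern, example.com, and the watermark routine's name are placeholders, not the original code:

```php
<?php
// Placeholder referer check: example.com stands in for the real site.
function isTrustedReferer(string $referer): bool
{
    return (bool) preg_match('#^https?://(www\.)?example\.com/#i', $referer);
}

function serveProtectedImage(string $path): void
{
    // Tell every browser (IE included) not to cache the scripted image,
    // so the referer check runs on each request.
    header('Cache-Control: no-store, no-cache, must-revalidate, max-age=0');
    header('Pragma: no-cache');                        // HTTP/1.0 caches
    header('Expires: Thu, 01 Jan 1970 00:00:00 GMT');  // already stale
    header('Content-Type: image/jpeg');

    if (isTrustedReferer($_SERVER['HTTP_REFERER'] ?? '')) {
        readfile($path);            // embedded in a page on the site
    } else {
        outputWatermarked($path);   // hypothetical GD watermark routine
    }
}
```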
  8. The only other thing I can think of is that because this particular page is called by AJAX, if the user navigates away before it finishes loading/running, it may stop the script, but I'm not sure whether that's even possible. Would the script continue running, or would it stop? There's more to the actual script: that part fetches the page, then it's parsed as XML and put into a variable. The variable is then saved locally as an XML cache, and the variable is then used as output. If the script is stopping because a user navigated away, is there a way to make it run all the way through?
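If the aborted AJAX request really is killing the script, PHP has a switch for exactly this. A sketch — the fetch/parse/cache steps themselves are elided, and the 60-second cap is an arbitrary choice:

```php
<?php
// Keep executing even if the user navigates away and the AJAX request
// is aborted mid-run, so the fetch -> parse -> cache sequence finishes.
ignore_user_abort(true);
set_time_limit(60);   // but still cap total runtime at 60 seconds

// ... fetch the remote page, parse it as XML, write the local cache ...
```

Worth knowing: PHP usually only notices the disconnect when it tries to send output, so a script that buffers everything and echoes at the end may survive the abort anyway.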
  9. [code]$timeout = 5; // set to zero for no timeout

// retrieve file
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, 'http://www.xxxxxx.com/xxxxx.aspx');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, $timeout);
curl_setopt($ch, CURLOPT_TIMEOUT, $timeout);
$file = curl_exec($ch);[/code] This is what I'm using right now.
  10. I basically have a script that fetches info from another site using curl. To avoid multiple instances of the fetch running at once, it creates a file as soon as it begins to fetch, and I set a curl timeout for the fetch. The problem I'm having is that if it times out, it doesn't remove the file; it just times out and doesn't continue. I looked up possibilities and found set_time_limit(0);, but I don't want it to be able to run forever. I just want it to write the file, attempt the fetch, and if it hasn't completed after x seconds, remove the file. Any thoughts?
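One way to make the lock file clean itself up even when the script dies hard (e.g. hits max_execution_time) rather than only when curl returns: register a shutdown handler when the lock is taken, since shutdown functions run on fatal timeouts too. A sketch — the function names, lock path, and URL are mine:

```php
<?php
// Take the lock; returns false if another fetch is already running.
function acquireLock(string $lock): bool
{
    if (file_exists($lock)) {
        return false;
    }
    touch($lock);
    // Remove the lock even if the script later dies fatally
    // (e.g. max_execution_time) -- this is the key fix.
    register_shutdown_function(function () use ($lock) {
        if (file_exists($lock)) {
            unlink($lock);
        }
    });
    return true;
}

function fetchWithLock(string $url, string $lock, int $timeout = 5): ?string
{
    if (!acquireLock($lock)) {
        return null;
    }
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, $timeout);
    curl_setopt($ch, CURLOPT_TIMEOUT, $timeout);
    $body = curl_exec($ch);   // false on timeout or failure
    curl_close($ch);
    unlink($lock);            // normal-path cleanup
    return $body === false ? null : $body;
}
```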
  11. Yeah, I actually took care of it. I also realized that if you use the same query string, it caches that as well, so I can keep the same query string until it's time to download a new picture from the original server and let it stay cached until the new one is downloaded. Thanks for the help.
  12. Would any of you happen to know how to force images to update? If the image fetched by the script changes, my script downloads the new one and saves it under the same name as the old one. Is there a way to force the browser to refresh its image cache?
  13. Heh, well, if it's not a flaw then it's simply a pain in the ass.
  14. As it turns out, IE has another flaw when it comes to AJAX: it refuses to fetch the actual content of the requested page and instead uses its last cached copy. I searched Google for a minute or two and came up with the following: [code]setRequestHeader("If-Modified-Since", "Sat, 1 Jan 2000 00:00:00 GMT");[/code] I just inserted it into: [code]xmlHttp.open("GET", urlFetch, true);
xmlHttp.setRequestHeader("If-Modified-Since", "Sat, 1 Jan 2000 00:00:00 GMT");
xmlHttp.send(null);[/code] and voila! No more cache.
  15. Good call. IE seems to process the JS differently. I hate IE. It didn't treat the page title as part of the responseText, and on top of that it jumps the gun, trying to replace HTML elements before they've finished loading. I had to make the page title another div and add a delay for the first load of the index. At least it seems to work in both browsers now.
