
QuickOldCar

Staff Alumni
  • Posts

    2,972
  • Joined

  • Last visited

  • Days Won

    28

Everything posted by QuickOldCar

1. That's why storing it as a string would be better; you don't need to deal with these issues.
2. It's said that char is 50% faster than varchar, though I've never run any performance tests on it myself. With varchar, if you declare a larger size but store less data, it will only use as much space as the data needs, so you get some space savings and room to breathe, and possibly fewer problems (for example if a value had a trailing space that wasn't trimmed, a change in code, etc.).
3. Stick with char(60) then; there's no sense using binary if it's not binary data, since you would need to do string-to-binary conversions.
4. Everything in a URL is case sensitive, though browsers will lowercase the domain. So if it's a path/folder/file, you should always use the proper case. What you see there is not a real location but text added into the URL for SEO purposes; as requinix said, they probably just use the id. It could be a fancy rewrite rule, and the URL could look something like this: http://www.ebay.co.uk/?itm=toshiba-satellite-c50-amd-e-series-1-4ghz-dual-core-15-6-inch-500gb-2GB-laptop&id=151458954958 The parameters could be uppercased for display and lowercased for usage. Edit: Yes, this works the same: http://www.ebay.co.uk/itm/151458954958
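A hedged sketch of such a rewrite rule (a guess at the idea, not eBay's actual setup): it captures the trailing numeric id from the pretty URL and passes it to the script internally, ignoring the SEO text.

```apacheconf
# Map /itm/anything-151458954958 (or /itm/151458954958) to a
# query-string id internally; the SEO slug is discarded.
RewriteEngine on
RewriteRule ^itm/(?:.*-)?([0-9]+)$ /index.php?id=$1 [L,QSA]
```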
5. I use varchar(255) because mine vary in length. If the values are always the same length go with char; if they will vary, use varchar. varchar stores only as many characters as the value actually contains, while char pads the value with spaces out to the full declared length.
6. Define the body variable and then append to it this way:

   $body = '';
   $body .= "Name: ".$_REQUEST['Name']."\n";
   $body .= "Email: ".$_REQUEST['Email']."\n";

   Or use heredoc or nowdoc (note the closing EOF; must start its own line with no indentation, and heredoc already keeps the literal line breaks):

   $body = <<<EOF
   Name: {$_REQUEST['Name']}
   Email: {$_REQUEST['Email']}
   EOF;
7. I don't see the reason for a shell command; you can upload multiple files through a form. When you write a script to upload files, it's you who determines where to save them, or their location in a database. You could create piles of folders acting as categories/tags and carefully place each image into the proper folder (too messy and hard to search). A better method is a database structure such as id, title, filename, timestamp/date, category. Keep all images in the same folder and just add another column in the database to categorize them. If you want multiple categories, you should create another table for categories and mark which image id belongs to which category. Make a form with the title and category, and rename the image on upload with a filename such as timestamp.ext or pretty_safe_name_timestamp.ext, using time(). Do not use their uploaded file names. As for displaying in your script, you can do a database query for categories, make a search on titles, or sort by date or id. For image display you append the default image folder and then the filename from the database: ./images/$filename
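The rename step above can be sketched like this (the function name is my own; a sketch assuming the extension was already validated against a whitelist):

```php
<?php
// Build a stored filename like pretty_safe_name_1700000000.jpg
// from a user-supplied title and an already-validated extension.
function safe_upload_name($title, $ext) {
    $name = strtolower(trim($title));
    $name = preg_replace('/[^a-z0-9]+/', '_', $name); // strip unsafe chars
    $name = trim($name, '_');
    return $name . '_' . time() . '.' . $ext;
}

// e.g. safe_upload_name('My Cat Photo!', 'jpg')
//      gives something like 'my_cat_photo_1700000000.jpg'
```

You would call this once per uploaded file and store the result in the filename column, never the client's original name.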
8. You can go that route, or do something like your own APIs per server that only you access to retrieve and send data. A simpler script able to handle requests is just as viable; it all depends on what you need to do and how much effort you're willing to put into it.
9. I'll agree with requinix: if you're in development and testing, do it at home. When you are ready to go live, try out ovh.com
10. I think they want people to be able to code on their site and use it as an API service. A bit dangerous, as they can write anything to compromise your server in many ways. It sounds like a great idea, and I've thought of it myself, except for the part about trusting others. I've been making a multiple-API site that uses my own code instead.
11. Those are just default defines; no need to edit those. The cache saves your scripts keyed by the address-bar URL. Make a script that takes dynamic values from the address bar; the script connects to the external API with those values and returns JSON data on the page. You would then connect to your script as if it were the API.
12. Do your database connection first, then call mysqli_real_escape_string(), or the values will come back as empty strings.
13. Drop an empty index file in there to prevent browsing it.
14. If an image is shown on a page it can be accessed; that's the way the net works. The code below will at least stop direct visits from using your bandwidth, via htaccess rules (hotlink protection):

    RewriteEngine on
    RewriteCond %{HTTP_REFERER} !^http(s)?://(www\.)?yourdomain.com [NC]
    RewriteRule \.(jpg|jpeg|png|gif)$ - [NC,F,L]
15. I totally get what you are saying: it's for the people stuck on a server that won't upgrade, or who bought an expensive script which now has outdated functions, is no longer supported, and will never get a newer version. I looked into the same thing about a year ago. I was pulling in all the functions used in a script, determining the PHP version against a list of all available functions, and for the ones no longer there, substituting a same-named function that does a similar task. It seemed like too much work to me, covering every possible version and the multitude of functions, along with extensions that may or may not be supported. Maybe with fewer versions and extensions it would be a bit easier; it's like a mangled web the way it is.
16. I'll tell you how I cache my API JSON responses locally. I store my cache in /var/cache/api/ with permissions set to 755 and owner www-data. Not sure how you have your API set up, but my scripts are includes through the server if it passes API checks. I believe you are trying to cache external API requests? Well, this can cache anything: make a script that fetches and displays the data, call on that script, and the cache will do the rest. You can change the folder location to wherever you need it.

    api-cache.php

    <?php
    class cache {
        var $cache_dir  = '/var/cache/api/'; // cache storage location
        var $cache_time = 86400;             // cache lifetime in seconds
        var $caching    = false;
        var $file       = '';

        function cache() {
            $this->file = $this->cache_dir . urlencode($_SERVER['REQUEST_URI']);
            // filemtime, not fileatime: access time is bumped on every read,
            // so with fileatime the cache would never expire
            if (file_exists($this->file) && (filemtime($this->file) + $this->cache_time) > time()) {
                // serve the cached copy and stop
                readfile($this->file);
                exit();
            } else {
                // start capturing output to build the cache
                $this->caching = true;
                ob_start();
            }
        }

        // flush the captured output to the browser and to the cache file
        function close() {
            if ($this->caching) {
                $data = ob_get_clean();
                echo $data;
                file_put_contents($this->file, $data);
            }
        }
    }
    ?>

    Usage:

    require_once('api-cache.php');
    $ch = new cache();
    echo date("D M j G:i:s T Y"); // cached content area
    $ch->close();
17. You should just update to mysqli; it's not that difficult. Code should be updated, because there are usually good reasons why those functions were abandoned.
18. If you gave more info about the site you want to scrape, or the data, it might be easier. There are various ways to connect to websites, and it all varies with the type of data and the complexity of what you are trying to send or retrieve. curl would be my preference over all of them; you have lots more control over everything. A simple way to connect would be using file_get_contents(). For XML you can use simplexml_load_file(), though I prefer to use curl for the connection and responses, then simplexml_load_string() to create an object. Once you get the raw data, you need to access the items you want; this is called parsing. Options include:
    json
    simplexml
    dom
    simplehtmldom
    preg_match() and preg_match_all()
    Once you have the data you can output it or save it to your database.
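A minimal sketch of the simplexml_load_string() step, using an inline XML string as a stand-in for a curl response (the element names are made up for the example):

```php
<?php
// Pretend this string came back from a curl request.
$raw = '<items><item><title>First</title></item><item><title>Second</title></item></items>';

// Parse the raw XML into an object and walk the items.
$xml = simplexml_load_string($raw);
$titles = array();
foreach ($xml->item as $item) {
    $titles[] = (string)$item->title;
}

// $titles now holds array('First', 'Second')
```

From here you would output the values or insert them into your database.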
19. This query will fail if there is no confession parameter in the URL:

    $confId = $_GET['confession'];
    $chkViews = mysqli_query($mysqli,"SELECT 'X' FROM views WHERE confessId = ".$confId." AND viewIp = '".$viewIp."' LIMIT 1");
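A minimal sketch of guarding that parameter before querying (the 0 fallback is my own assumption; casting to int also keeps a numeric id safe to interpolate):

```php
<?php
// Use the GET parameter only when present, cast to an integer,
// falling back to 0 when the parameter is missing.
$confId = isset($_GET['confession']) ? (int)$_GET['confession'] : 0;

if ($confId > 0) {
    // safe to build and run the query here
}
```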
20. Did you do this? Find the location of xampp-control.exe; it should be in the root of your installation directory. Create a file XAMPP.ini in that directory (so that XAMPP.ini and xampp-control.exe are in the same directory). Put the following in the XAMPP.ini file:

    [PORTS]
    apache = 81
21. The answer is yes, but you need to associate whatever GET requests you want with the files you want to include, in addition to other files or data (content in an array?). The example given is similar; you could expand on it in the same way. Is there a specific reason why you can't use a database? It's the best way to save and display data.
22. Didn't hansford's example using a while loop help? That's what you should do instead of a for loop anyway.
23. This doesn't look like a normal WordPress query.

    This returns just 10 results from startrow 0 in MySQL:
    $query = "SELECT * FROM table WHERE category IN (1,2) ORDER BY id DESC LIMIT 10";

    This tells it to start from row 0 and show 10:
    $query = "SELECT * FROM table WHERE category IN (1,2) ORDER BY id DESC LIMIT 0,10";

    This tells it to start at row 10 and show just 10:
    $query = "SELECT * FROM table WHERE category IN (1,2) ORDER BY id DESC LIMIT 10,10";

    So your query needs a dynamic value to control which row to start from, i.e. normal pagination:
    $query = "SELECT * FROM table WHERE category IN (1,2) ORDER BY id DESC LIMIT $startrow,10";

    Sure, you can do something like +1 or -1 on the current page, but it sounds like WordPress has its own pagination or is using a pagination plugin. Pagination usually takes the page number and then does some math on it to calculate which startrow to use. For instance:
    page 1 equates to startrow 0
    page 2 equates to startrow 10
    page 3 equates to startrow 20
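The page-to-startrow math above can be sketched as follows (the function name is my own):

```php
<?php
// Convert a 1-based page number into a LIMIT start row.
function startrow_for_page($page, $perpage = 10) {
    $page = max(1, (int)$page); // guard against 0 or negative pages
    return ($page - 1) * $perpage;
}

// page 1 -> 0, page 2 -> 10, page 3 -> 20
```

The result is what you would drop into `LIMIT $startrow,10`.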
24. So are you trying to make a meta search, sitemaps, or a links index? Maybe there is a better way to go about this than the way you are attempting. Even if you pull 10 results with a SQL limit, after you check them there could be anywhere from 0 to 10 of them. It would be best to query with data you know is there, like saving the domain in a column, instead of storing the parent numbers and always determining which one they are in a loop. When I want to save URLs and know the domain, I parse the host and save it as a new column when saving the data. It gets harder extracting main hosts from subdomains, but you can easily do a preg_match on them. You can even go so far as to do search or LIKE queries on the domain names, or even on the URLs if you saved full ones.
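A minimal sketch of that host-parsing step, assuming a simple "last two labels" rule for the main host (the function name is my own; real suffix handling, e.g. .co.uk, needs a public-suffix list):

```php
<?php
// Extract the host from a full URL, then reduce it to its last two
// labels as a rough "main host" (does not handle multi-part TLDs).
function main_host($url) {
    $host = parse_url($url, PHP_URL_HOST);
    if ($host === null || $host === false) {
        return null; // no host component, or malformed URL
    }
    if (preg_match('/([^.]+\.[^.]+)$/', $host, $m)) {
        return $m[1];
    }
    return $host;
}

// e.g. main_host('http://blog.example.com/post/1') -> 'example.com'
```

The returned value is what you would store in the domain column at save time.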
25. The solution would be to limit it: read fewer emails at a time and spread it out, since it must be doing way too many.