QuickOldCar
Everything posted by QuickOldCar
-
I'm not sure what your redirect function does. If you use header(), do an exit; right after so the rest of the code does not continue. It looks to me like it's applying this rule constantly and passing empty p parameters. Here is what you have:

$page_url_rewrite = substr(rawurldecode($_SERVER['REQUEST_URI']),1);

$_SERVER['REQUEST_URI'] holds the full request path including the query string.

redirect($config['server_protocol'].$config['server_name'].'/?p='.$page_url_rewrite);

Going by your rules, that will add the p parameter plus any directory plus any additional query string.

$page_url = request_var('p', ''); //what is the request_var() function doing?

$page_url is defined as set but blank, so why add this if you're not using it? The $_GET or $_REQUEST array already contains the parameters and values currently in the url.
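A minimal sketch of what a redirect helper could look like with the exit in place (the function and the commented usage are just assumptions, since I don't know what your redirect() actually does):

<?php
// Hypothetical redirect helper: send the Location header and stop the script
function redirect($url)
{
    header('Location: ' . $url);
    exit; // without this the rest of the script keeps running after the header
}

// Example usage with the assumed config array from your code
// redirect($config['server_protocol'] . $config['server_name'] . '/?p=' . $page_url_rewrite);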
-
Honestly, a pile of vanilla wordpress installs is not something you want to be managing across your server. You can enable wordpress multisite and get an enormous amount more control, along with some very important additional plugins.

When talking "professional hosting"... you want something like a virtualization manager using KVM, OpenVZ, Xen or VMWare: hosted virtual OS's, virtual dedicated servers, virtual private servers. Account management, help tickets and support, plus cpu, memory, bandwidth and storage limitations. The client picks an OS and installs their own server or a turnkey package, and each client can only access their own containers.

Waaaay back I hosted using hypervm and kloxo, which required centos. It was never an easy thing to do then or even now, so be prepared to get schooled with all this. It's hard to find a flaw-free solution that's perfect for everything.
-
Which web application i used to develope this?
QuickOldCar replied to deepakk's topic in Applications
I'll be blunt, it may be beyond your ability. I'm not trying to be rude, just realistic about this. It's not a simple or small application. Get someone who knows what they are doing and save yourself an enormous amount of time.

You have an error_log file, try there first. Then move on to any other error logs... for example the log files for the server stored in the apache and mysql folders. Is this a Windows or Linux based server? The locations for these will vary depending on the operating system and even the distro.

Follow along with what's in this link, it will give you the information you need about your configuration. http://php.net/manual/en/function.phpinfo.php

Make a file and paste this code within it. Name it phpinfo.php and visit this new script in the url.
Trying to run it on a local server? http://localhost/phpinfo.php
Have a qualified server online? http://yourdomain.com/phpinfo.php

<?php phpinfo(); ?>

Some information on how you came to have this would help. Was there ever a time it was error free and worked? Was it made specifically for you? Found on the roadside like an abandoned puppy and kept? All of a sudden not working? Server updates or a new host and then it stopped working? There would be so many questions...

If you feel like opening some files in an editor to make some changes, like a new domain or the file locations for the application... try notepad plus.

If you have error reporting enabled and have any specific errors, feel free to post them here as that could help narrow down the issues. If you post a huge error log file some may not sift through it.

Again... just going by how you asked the question... you should probably hire someone that does this. We have a wonderful for-hire jobs section.
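For reference, a quick way to turn error display on while you debug (a development-only sketch, don't leave display_errors on for a live site):

<?php
// Show all errors while debugging (development only)
ini_set('display_errors', 1);
ini_set('display_startup_errors', 1);
error_reporting(E_ALL);
-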
help with _GET, assigning to variable and checking that variable
QuickOldCar replied to SF23103's topic in PHP Coding Help
A single = assigns, a double == compares if equal, and a triple === compares if equal and also the same type. http://php.net/manual/en/language.operators.comparison.php
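A quick sketch of the difference (the variable here is just for illustration):

<?php
$value = "1";           // = assigns the string "1" to $value
var_dump($value == 1);  // bool(true)  - equal after type juggling
var_dump($value === 1); // bool(false) - same value but not the same type
-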
You would build a form, and then to make it easier use phpmailer to send the email.
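Roughly like this, assuming PHPMailer was installed with Composer; the mail host, credentials, addresses and form field names below are all placeholders you'd swap for your own:

<?php
use PHPMailer\PHPMailer\PHPMailer;
use PHPMailer\PHPMailer\Exception;

require 'vendor/autoload.php'; // assumes a Composer install of PHPMailer

$mail = new PHPMailer(true);
try {
    $mail->isSMTP();
    $mail->Host     = 'smtp.example.com'; // your mail server
    $mail->SMTPAuth = true;
    $mail->Username = 'user@example.com';
    $mail->Password = 'secret';
    $mail->Port     = 587;

    $mail->setFrom('user@example.com', 'Contact Form');
    $mail->addAddress('you@example.com');
    $mail->Subject = isset($_POST['subject']) ? $_POST['subject'] : 'Contact form message';
    $mail->Body    = isset($_POST['message']) ? $_POST['message'] : '';
    $mail->send();
    echo 'Message sent';
} catch (Exception $e) {
    echo 'Message could not be sent: ' . $mail->ErrorInfo;
}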
-
You are dealing with a large amount of data. I would first ensure the tables are properly indexed. http://dev.mysql.com/doc/refman/5.7/en/create-index.html
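For example, if most of your queries filter on something like a user id and a date (the table and column names here are just placeholders), an index along these lines can make a big difference:

CREATE INDEX idx_user_date ON orders (user_id, created_at);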
-
PHP is totally different. PDO is a database wrapper allowing you to connect to and manipulate a database, similar to mysql or mysqli. The mysql functions are deprecated and removed in newer versions of php.
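A minimal PDO connection and prepared statement looks something like this (host, database name and credentials are placeholders):

<?php
$pdo = new PDO('mysql:host=localhost;dbname=test;charset=utf8', 'user', 'password');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

// Prepared statement keeps the value separate from the SQL
$stmt = $pdo->prepare('SELECT name FROM users WHERE id = ?');
$stmt->execute(array(1));
$row = $stmt->fetch(PDO::FETCH_ASSOC);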
-
Need Help to make advance search in PHP website Project
QuickOldCar replied to AMITKUMAR's topic in PHP Coding Help
If you post some sample data from each database we may be able to give more help. What you should have in the end is something like this to compare: train number, station number/location of departure, station number/location of arrival, departure time, arrival time.
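As a rough sketch only (the table and column names below are assumptions, since we haven't seen your schema), the search query could end up looking something like:

SELECT train_no, depart_station, arrive_station, depart_time, arrive_time
FROM schedules
WHERE depart_station = ? AND arrive_station = ?
ORDER BY depart_time;
-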
You may have to trim it if it's saved as a space in the database. SELECT * FROM `schedules` WHERE `clientID` IS NULL
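If the column can hold a NULL, an empty string or just spaces, a query along these lines (same table as above) should catch all three:

SELECT * FROM `schedules` WHERE `clientID` IS NULL OR TRIM(`clientID`) = '';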
-
Do another query to that table. It's written so it will only populate the array if there are results, otherwise it stays an empty array. I didn't know the actual name of your table so change that.

$ar = array();
$sql = "SELECT offense_comment FROM comments_table";
if ($result = $conn->query($sql)) {
    if ($result->num_rows > 0) {
        while ($row = $result->fetch_assoc()) {
            $ar[] = $row['offense_comment'];
        }
    }
}
-
Just for the record, a user can delete a cookie, or it gets deleted when the browser closes if no expiry is set. If it's something you want to control on the server and not the client, use a session.
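A minimal sketch of using a session instead (the key name and value are just examples):

<?php
session_start();                // must be called before any output
$_SESSION['user_id'] = 123;     // stored server side, keyed by the session cookie
echo $_SESSION['user_id'];      // available on later requests after session_start()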
-
Best way to implement a registration setup in setup
QuickOldCar replied to Da9L's topic in PHP Coding Help
There is just so much to explain and write about this topic. I sent you a PM and am willing to give you the grand tour through teamviewer on my server of how I do it. It would be easier to explain and show it.

A lot of people use oauth for their api security, but I'm not too sure about the current security of the latest version. I build my own REST api's.

The way I go about it is to make a front door script first. This would be the api.myserver.com , api.myserver.com/script/ , api.myserver.com?app=cool_app_name address, or however you would like to structure it. It does all the checks needed, such as: check whether it's a valid public or private key, use that key to query and find the user, check whether that user has paid or not, domain protection and so on. If all the above criteria are met you include your application script so they can use it, otherwise you return an access denied message.

For api systems I usually use json responses as the default. Since it's a REST design you can accept multiple header requests for whichever format a client would like to use. If the client wanted something like xml or html, I would fetch the json file and output their format choice. I usually cache json files to eliminate excessive usage. If no cache file exists it gets live data and creates a new json cache file, otherwise it uses the data from the json cache. There is an expire time on the file.

To answer some of your questions:
You would need a user registration and login system.
Once a user pays you assign them a randomly made hashed key, incorporating something such as a user id or username as a salt to ensure it is unique, and store that under their user account in the database.
You have another column for that user in your database marking whether payment has been made or not, using 0/1 or n/y values.
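A stripped-down sketch of that front door idea (the table, column and file names are all assumptions, and a real version would do more checks like domain protection and rate limiting):

<?php
// Hypothetical "front door" for a keyed REST api
header('Content-Type: application/json');

$key = isset($_GET['key']) ? trim($_GET['key']) : '';

$pdo  = new PDO('mysql:host=localhost;dbname=api;charset=utf8', 'user', 'password');
$stmt = $pdo->prepare('SELECT id, paid FROM api_users WHERE api_key = ?');
$stmt->execute(array($key));
$user = $stmt->fetch(PDO::FETCH_ASSOC);

if (!$user || $user['paid'] != 1) {
    echo json_encode(array('error' => 'access denied'));
    exit;
}

// Key is valid and the account is paid, hand off to the actual application
include 'cool_app.php';
-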
You would want to do the row count for this query:

$res = mysql_query("SELECT * FROM article LIMIT ".$per_page." OFFSET ".$offset);

$allrecords is being used to create the pagination links for all records, with no offset or limit.

Also check the rows like this... against a number.

if (mysql_num_rows($res) < 1) {
    header("Location: https://localhost/pagi/articles.php");
    exit;
}

You can link back to the main script since you already set page to 1 if it's not set.
-
If there is no result from the mysql query, or the row count is less than 1... do a header redirect to the url with ?page=1 on it. Don't forget to use exit;
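Something along these lines (the script name is just an example, use your own):

if (mysql_num_rows($res) < 1) {
    header("Location: articles.php?page=1");
    exit; // stop the script so nothing below keeps running
}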
-
Online / Offline system for basic POS system
QuickOldCar replied to CamaroMan's topic in Application Design
Use a live online server, not local.

Designate a main local server with a local address, changing the hosts file and servername. Any additional stations always connect to that local main server; these wouldn't need local servers installed. When there is internet, the local main server can communicate with the live server in the background and make any updates. Since all other stations access just the one main local server it should work out well, and since it's just one server syncing it simplifies it all.

I suppose the data storage is up to you. I would use mysql on the live server and pass json files using curl with a cron job from the main local server. The live server can check for any new files with a cron job and update accordingly. I could mention sqlite, but I don't like the file locking for writes while also trying to read.

The alternative would be to install local servers on every machine and do updates to the live server in the background using cached json. I don't really know if this is a single or multi-client pos, how many stations there are, or the amount of work setting it up. Client computers puke a lot, so it's probably best to do fewer installs on each one. In the same respect it's hard to rely on just a single computer as well, but you could always set up a backup server syncing with copy commands. There are remote mysql options in which you do not need multiple copies of the same database. I would never copy the mysql data files directly; always run new data through a process with timestamps and checks.
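A very rough sketch of the curl push from the local main server (the url, field names and $newSales array are all assumptions); the live server would read the uploaded json, compare timestamps and apply any changes:

<?php
// Push the latest local changes to the live server as json (run from cron)
$payload = json_encode(array(
    'station'   => 'main',
    'timestamp' => time(),
    'sales'     => $newSales, // assumed array of not-yet-synced rows
));

$ch = curl_init('https://live.example.com/sync.php');
curl_setopt($ch, CURLOPT_POST, true);
curl_setopt($ch, CURLOPT_POSTFIELDS, $payload);
curl_setopt($ch, CURLOPT_HTTPHEADER, array('Content-Type: application/json'));
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
$response = curl_exec($ch);
curl_close($ch);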
-
If you were to get an empty value for the filename in the array it would still come back true, because file_exists() would match the directory itself. If you only care about files, use is_file().
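For example (the directory name is just an assumption about how the path gets built):

<?php
$dir  = 'uploads/';
$name = '';                           // empty filename from the array
var_dump(file_exists($dir . $name));  // true, the directory itself exists
var_dump(is_file($dir . $name));      // false, it's a directory, not a file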
-
Working with multiple large csv files, comparing and manipulating the data, is not a good way to go about this. Use PDO with prepared statements and a database. Normalize your database structure and associate the id's with another table. Once you do this it will be a lot easier, even if you still have to read from a single csv file and insert new data into the database under those single id's.

Let's get back to comparing the 2 csv files. When dealing with files, the entire file is loaded into memory even if you just need a single line, especially when you need to loop through and check each line. In your case you have to load 2 files.

Take this example. You can then compare the arrays by their line id in whatever way you desire, then use array_merge(), array_combine() or whatever you wish to manipulate the data. It's hard to assist any further without samples of your data and the changes you expect.

<?php
$errors = array();
$the_file = "file.csv";
$the_data = "data.csv";

if (is_file($the_file)) {
    if (!is_readable($the_file)) {
        $errors[] = "$the_file not readable";
    }
} else {
    $errors[] = "$the_file is missing";
}

if (is_file($the_data)) {
    if (!is_readable($the_data)) {
        $errors[] = "$the_data not readable";
    }
    if (!is_writable($the_data)) {
        $errors[] = "$the_data not writable";
    }
} else {
    $errors[] = "$the_data is missing";
}

// Index each csv file by the id found in its first column
function array_by_id($file)
{
    $array = file($file);
    $exploded_data = array();
    foreach ($array as $key => $line) {
        if ($key != 0) { // skip the header row
            $line = trim($line);
            $explode = explode(";", $line);
            $line_id = $explode['0'];
            $exploded_data[$line_id] = $line;
        }
    }
    return $exploded_data;
}

if (empty($errors)) {
    $file_array = array_by_id($the_file);
    $data_array = array_by_id($the_data);

    //preview data
    echo "<pre>";
    print_r($file_array);
    echo "</pre>";
    echo "<pre>";
    print_r($data_array);
    echo "</pre>";
} else {
    foreach ($errors as $error) {
        echo "$error <br />";
    }
}
?>
-
parse error in an include file for login system
QuickOldCar replied to Michael_Baxter's topic in PHP Coding Help
You're missing a parenthesis:

if (password_verify($password, $db_password)) {
-
Maybe someone else will, I'm not going that far into it.
-
You need to split them into chunks of no more than 1,000 and send each chunk; I can't really help with what google cloud messaging's limits are beyond that. I can think of a way to do this:

Create a column named sent with a 0/1, y/n, null/1 value or whatever suits you.
Do a mysql query with a LIMIT of 1,000 for any messages not sent yet: AND sent IS NULL LIMIT 1000 (provided you set the default value as null). If you need to send them in an exact order, query them by lowest id as well.
Set each message's sent column in mysql to 1 to mark it as sent.
Have a cron job running the script on a schedule, allowing ample time for 1,000 messages to be sent via GCM. Breaking them up into even less than 1,000 will also work provided you set your cron time more frequently.
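Roughly like this (the table and column names are assumptions, $pdo is an existing PDO connection, and send_to_gcm() is a stand-in for however you call google cloud messaging):

<?php
// Grab the next batch of unsent messages, oldest first
$stmt = $pdo->prepare('SELECT id, message FROM push_queue WHERE sent IS NULL ORDER BY id LIMIT 1000');
$stmt->execute();
$batch = $stmt->fetchAll(PDO::FETCH_ASSOC);

$mark = $pdo->prepare('UPDATE push_queue SET sent = 1 WHERE id = ?');

foreach ($batch as $row) {
    // send_to_gcm() is hypothetical, swap in your actual sending code
    if (send_to_gcm($row['message'])) {
        $mark->execute(array($row['id']));
    }
}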
-
If it works with a certain amount and not more, it most likely has to do with timeout or memory limits.
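If that's the case, something like this at the top of the script (or the equivalent php.ini settings) is a quick way to test it; the values are just examples:

<?php
set_time_limit(300);              // allow up to 5 minutes instead of the default 30 seconds
ini_set('memory_limit', '256M');  // raise the memory ceiling for this script only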
-
http://symfony.com/doc/current/components/http_foundation/sessions.html

Symfony sessions are designed to replace several native PHP functions. Applications should avoid using session_start(), session_regenerate_id(), session_id(), session_name(), and session_destroy() and instead use the APIs in the following section.

While it is recommended to explicitly start a session, a session will actually start on demand, that is, if any session request is made to read/write session data.

Symfony sessions are incompatible with the php.ini directive session.auto_start = 1. This directive should be turned off in php.ini, in the webserver directives or in .htaccess.
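For reference, basic usage of the component looks roughly like this (the 'user_id' attribute is just an example, and this assumes the HttpFoundation component is autoloaded via Composer):

<?php
require 'vendor/autoload.php';

use Symfony\Component\HttpFoundation\Session\Session;

$session = new Session();
$session->start();              // explicit start, as the docs recommend

$session->set('user_id', 123);  // instead of writing to $_SESSION directly
$userId = $session->get('user_id');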
-
For things like this I would typically run a script with cron and cache the data results, most likely doing each of those 400 you mentioned individually. The cron script would connect on a schedule, look for any changes, and overwrite the cache file. Your script then uses the local cached data.
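A bare-bones version of that caching idea (the file name, expiry time and fetch_live_data() are all assumptions):

<?php
$cache_file = 'cache/results.json';
$max_age    = 600; // refresh if the cache is older than 10 minutes

if (is_file($cache_file) && (time() - filemtime($cache_file)) < $max_age) {
    $data = json_decode(file_get_contents($cache_file), true);
} else {
    $data = fetch_live_data();  // stand-in for the slow remote call
    file_put_contents($cache_file, json_encode($data));
}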
-
Not sure what I was thinking, maybe another coffee is in order.