Everything posted by mac_gyver

  1. you are probably looking for an absolute file system path, not an HTTP URL - require $_SERVER['DOCUMENT_ROOT'] . '/r/pagevisit.php';
  2. your connection probably doesn't exist and you would be getting a bunch of php errors. you need to ALWAYS have php's error_reporting set to E_ALL and when learning, developing, and debugging code/queries, set display_errors to ON. these settings should be in the php.ini on your system. next, you need to ALWAYS have error handling for all statements that can fail. for database statements - connection, query, prepare, and execute, the easiest way of adding error handling is to use exceptions and in most cases let php catch the exception where it will use its error related settings to control what happens with the actual error information (database errors will automatically get displayed or logged the same as php errors.)
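A minimal sketch of a PDO connection set up this way (the DSN, username, and password are placeholders - substitute your own):

```php
<?php
// hypothetical credentials - replace with your own values
$dsn = 'mysql:host=localhost;dbname=my_db;charset=utf8mb4';

$pdo = new PDO($dsn, 'username', 'password', [
    PDO::ATTR_ERRMODE            => PDO::ERRMODE_EXCEPTION, // throw exceptions on errors
    PDO::ATTR_EMULATE_PREPARES   => false,                  // use real prepared queries
    PDO::ATTR_DEFAULT_FETCH_MODE => PDO::FETCH_ASSOC,
]);

// any connection, query, prepare, or execute error now throws a PDOException.
// if your code doesn't catch it, php catches it and uses its error_reporting /
// display_errors / log_errors settings to display or log the error information.
```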
  3. use (NOT) FIND_IN_SET(str,strlist). the strlist parameter can be supplied via a single prepared query place-holder.
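A sketch of what that could look like with PDO (the table and column names here are invented for illustration):

```php
<?php
// $pdo is an existing PDO connection.
// the whole strlist value goes through a single place-holder:
$list = 'red,green,blue';

$stmt = $pdo->prepare("SELECT id, name FROM items WHERE FIND_IN_SET(color, ?)");
$stmt->execute([$list]);
$rows = $stmt->fetchAll();

// for the negated case, use:
// SELECT id, name FROM items WHERE NOT FIND_IN_SET(color, ?)
```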
  4. if all you are doing is mapping input values to output values, don't write out conditional logic for every possible choice. if the input to output mapping doesn't contain any calculable relationship, define a data structure (array, database table) to map the input to the output. for the example values you have shown, wouldn't you just break apart the value at the '-' character and use the first part as the lowest range value?
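For example, a lookup array instead of a chain of conditionals (the keys and values here are made up):

```php
<?php
// hypothetical input-to-output map; to support a new choice you add an
// array entry, not another if/else branch
$map = [
    'small'  => 10,
    'medium' => 20,
    'large'  => 30,
];

$input  = 'medium';
$output = $map[$input] ?? null; // null if the input isn't a valid choice

// for range values like '100-199', break apart the value at the '-' character
// and use the first part as the lowest range value:
$range = '100-199';
[$low, $high] = explode('-', $range);

echo $output, ' ', $low; // 20 100
```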
  5. also, are php's error related settings set to report and either display or log all errors?
  6. the code at the end is using the query() method to execute a prepared query that has named place-holders in it. the only difference in the last two sections of code is the $sort - ORDER BY ... part of the query. instead of duplicating code (which is where the ->query() vs ->prepare()/->execute() mistake is), just conditionally include the ORDER BY part and have a single instance of the rest of the code. do you have any error handling (exceptions would be a good choice) for the pdo statements that can fail? also, you should be using a SELECT COUNT(*) ... query to get the total number of matching rows for pagination. you are currently SELECTing all the columns and rows of matching data just to get the row count. your name LIKE search doesn't have any apparent wild-card characters, so it should just use an equal (=) comparison.
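A sketch of the conditional ORDER BY and the COUNT(*) pagination query (the table, columns, and sort whitelist are invented):

```php
<?php
// hypothetical whitelist mapping allowed sort inputs to ORDER BY terms
$sortable = ['name' => 'name ASC', 'date' => 'created_at DESC'];

$order = '';
if (isset($_GET['sort'], $sortable[$_GET['sort']])) {
    $order = ' ORDER BY ' . $sortable[$_GET['sort']];
}

// get the total number of matching rows for pagination with COUNT(*),
// instead of fetching all the matching data just to count it
$stmt = $pdo->prepare("SELECT COUNT(*) FROM games WHERE name = ?");
$stmt->execute([$_GET['name'] ?? '']);
$total = $stmt->fetchColumn();

// a single instance of the data query, with ORDER BY conditionally appended
$stmt = $pdo->prepare("SELECT id, name FROM games WHERE name = ?" . $order);
$stmt->execute([$_GET['name'] ?? '']);
$rows = $stmt->fetchAll();
```

Because the ORDER BY term comes from a whitelist array, not from user input, concatenating it into the query is safe.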
  7. the code is intentionally disabling error reporting via the above line. comment out that line and see what you get.
  8. most likely, either your actual database name contains some non-printing/white-space character(s) or the Config::get() method, which you haven't posted the code for, is adding some non-printing/white-space character(s) to the value (i'm betting either a new-line or a <br> tag.) this code is just adding an unnecessary, pointless layer. it has no useful error handling, is using emulated prepared queries (the default), cannot be used with a LIMIT x term in a query, and can't be used with more than one connection or a different database type. if you want to do something useful for a database class, extend the PDO class and add a general prepared/non-prepared query method that will use a prepared query if there are input parameters and will use a non-prepared query if there are not. when you make the database connection, you should set the character set to match your database tables, set the error mode to exceptions, set emulated prepared queries to false, and set the default fetch mode to assoc (assoc works best when dynamically processing fetched data.)
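One possible sketch of such an extended class (the class and method names are made up; the credentials are placeholders):

```php
<?php
class Db extends PDO
{
    // general-purpose query method: runs a prepared query if there are input
    // parameters, a non-prepared query if there are not
    public function run(string $sql, array $params = []): PDOStatement
    {
        if (empty($params)) {
            return $this->query($sql);
        }
        $stmt = $this->prepare($sql);
        $stmt->execute($params);
        return $stmt;
    }
}

// make the connection with the recommended settings
$db = new Db('mysql:host=localhost;dbname=my_db;charset=utf8mb4', 'user', 'pass', [
    PDO::ATTR_ERRMODE            => PDO::ERRMODE_EXCEPTION, // exceptions for errors
    PDO::ATTR_EMULATE_PREPARES   => false,                  // real prepared queries
    PDO::ATTR_DEFAULT_FETCH_MODE => PDO::FETCH_ASSOC,       // assoc fetch mode
]);
```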
  9. since the ssm_menu_items last insert id is generated inside the 1st foreach loop, you would either need to insert the row in the ssm_menu_connection table there, OR store the ids in an array and then loop over that array of ids to execute the 3rd query. however, the 'INSERT IGNORE INTO ssm_menu_items...' query won't give you a last insert id if the menu item is already in the ssm_menu_items table. you would need to use an INSERT ... ON DUPLICATE KEY UPDATE query, where the sole purpose of the UPDATE part of the query is to use the MySql LAST_INSERT_ID(x) function to 'get' the id for existing ssm_menu_items. lastly, why are you using mysqli_real_escape_string() calls with prepared queries? the main point of a prepared query is that it separates the data from the sql syntax, so that nothing in the data will be operated on as sql syntax.
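A sketch of that query pattern, assuming ssm_menu_items has an auto-increment id column and a unique index on the name column (the column names beyond the table name are assumptions):

```php
<?php
// insert the menu item, or, if it already exists, make the LAST_INSERT_ID(x)
// function report the existing row's id instead of leaving it unset
$sql = "INSERT INTO ssm_menu_items (name) VALUES (?)
        ON DUPLICATE KEY UPDATE id = LAST_INSERT_ID(id)";
$stmt = $pdo->prepare($sql);

foreach ($menu_items as $name) {
    $stmt->execute([$name]);
    $item_id = $pdo->lastInsertId(); // valid for both new and existing rows
    // either insert the ssm_menu_connection row here, using $item_id,
    // or collect the ids into an array and loop over it afterward
}
```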
  10. i'm going with the code never copies the (unnecessary) $cart_items variable back to the session variable, so the code starts over each time it gets executed. just use $_SESSION["cart_items"] everywhere and forget about the $cart_items variable.
  11. the most immediate problem is that you are not using the correct $_POST field and table column in the WHERE clause of the SELECT query (edit: i realized while writing this that the SELECT query you have shown is part of the update process, to retrieve a specific row of data to populate the form fields with, not part of a name search.) if you are entering the name (or partial name) of a game to search for, wouldn't you be searching the table column holding the game's name? why are you trying to match the id column? next, don't write code like this. there are a bunch of problems that have resulted in a wall of code that's both insecure and has created a 'cannot see the forest for the trees' problem (which is perhaps why you are using the wrong field/column in the WHERE clause.) a laundry list of issues - don't create a bunch of discrete variables for each different form you write code for. this is just a waste of typing time. instead, operate on the form data as a set, by keeping the data as an array and operating on the elements of the array. by using exceptions for database statement errors, any connection, query, prepare, and execute error will transfer control to the exception handler. therefore, any discrete error handling logic in your code won't ever be executed and should be removed. the only exception handling try/catch blocks you should have in your code are for errors that are recoverable, that your code can do something about, such as dealing with the inserting/updating of duplicate or out of range user submitted data. all other database statement errors are non-recoverable and there's no good reason for your code to catch those exceptions. just let php catch and handle them.
by using exceptions for errors and letting php catch them, php will use its error related settings to control what happens with the actual error information (database statement errors will 'automatically' get displayed or logged the same as php errors.) 'function getPosts()' - don't do this. you want code to be easy to write and debug and be readable by everyone. by intentionally using numerical indexes, you have made more work while writing this code and more work for anyone trying to read/maintain the code. also, once you write and test the code for a function, you should not find yourself regularly editing it. functions should not contain application specific code that you must change each time you do something new. external data can be anything and cannot be trusted. you must validate all external data before using it, and you should use a prepared query when supplying external/unknown data to the sql query statement. those last two points will help avoid the current error, because you would not attempt to run a query if an expected input is empty, and a numerical input that can be empty/null won't produce a query error. a name or partial name search can match more than one row (the same game name for more than one platform.) you would loop over the result from such a query and display as many rows of data as the query matched. to update/delete the data for a specific row, you would produce an edit link and a delete form for each row that is displayed. the edit link would contain the id of the row. when you click the edit link, the code would query for and fetch the data matching that row and populate the form field values. the current SELECT query seems to be for this part of the process, not for a name search. you need one more SELECT query.
  12. you should use http_build_query() when building the query string part of urls (it automatically urlencodes the values for you.) you should start with a copy of any existing $_GET parameters (before the start of your loop), so that your code only modifies the one(s) it is responsible for. you should use &amp; as the separator in links.
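A sketch of a pagination link built this way (the 'page' parameter name is an example):

```php
<?php
// start from a copy of the current $_GET parameters, then modify only the
// parameter this link is responsible for (here: the page number)
$get = $_GET;
$get['page'] = 3;

// http_build_query() urlencodes the values; &amp; is the separator for links
$qs = http_build_query($get, '', '&amp;');

echo '<a href="?' . $qs . '">3</a>';
```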
  13. what is the overall goal of this code? if the quantity is greater-than zero, insert/update a row matching a job_id/item_id, otherwise, delete a row matching a job_id/item_id? is this what the code/queries are doing now? note: a drink is a type/category of an item. your database table should be general-purpose and handle any type of item in an order. the table is for recording orders and should be named as such. also, the columns should just be named job_id, item_id, and quantity (or some abbreviation of those.) also, if you use the item_id as the form field's array index value, you only need one set of form fields and no extra variables when you process the form data.
  14. if you prepare the DELETE query, once, before the start of the loop, using the same conditions that the INSERT ... query is using to identify which row to operate on, you may see the mistake in your logic. btw - each header() redirect needs an exit; statement after it to stop program execution. your current login check code is still running all the code on the page even if someone is not logged in.
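A minimal sketch of a login check with the exit; after the redirect (the session key and file name are placeholders):

```php
<?php
session_start();

// a header() redirect must be followed by exit; otherwise every statement
// after this point still runs, even for a visitor who is not logged in
if (!isset($_SESSION['user_id'])) {
    header('Location: login.php'); // 'login.php' is a placeholder name
    exit;                          // stop program execution here
}

// code below this line only runs for logged-in users
```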
  15. it is inefficient to query for data inside of a loop. your goal should be to use the least amount of queries that get the data you want, generally in the order that you want it, then just loop over the data to produce the output. when the query to get data involves going to an external source, such as using curl to read xml data from an api, the problem is even worse, due to the communications and the parsing of the data. you want to reduce the number and size of the communications and you want to reduce the number of times you parse the same data values. you need to use a data-centric approach, rather than a piecemeal approach to getting data. for the application you are creating, what is all the data you need, and can you get it all using a single query? if not, can you get multiple sets of information (the custom field information) for multiple ids in a single query? also, how often does any of the source data change, so that you would be able to cache previously fetched and parsed data locally in a database?
  16. this symptom is typical of a changing host-name/sub-domain in the URL (a www. vs no www) and the result of being "redirect happy" and redirecting all over a site. if you initially visit a site with a url that does/doesn't have a www, then perform a redirect that uses a different host-name/sub-domain than the initial url used to reach the site, the default session id cookie domain setting will cause the session id cookie to no longer match, and the initial session id is no longer sent from the browser to the server. after the initial redirect, all the variations of the URL are now the same and the session id cookie works as expected. so, 1) be consistent in all the URLs that you use in links, form actions, redirects, ... on a site (this alone won't solve the problem since someone can type any variation of a url or have a short-cut/book-mark with any variation), 2) set the session id cookie domain setting to match all variations of your domain, and 3) set up a htaccess redirect to cause all requests to go to the same variation of your domain name.
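For point 2), a sketch of setting the session id cookie domain before starting the session ('.example.com' is a placeholder for your actual domain; the leading dot makes the cookie match www and non-www variations):

```php
<?php
// set the session id cookie domain so the cookie matches all sub-domain
// variations of the site ('.example.com' is a placeholder)
ini_set('session.cookie_domain', '.example.com');

session_start();
```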
  17. you would also handle the user id differently, depending on if a user is editing his own profile or if a moderator/administrator is editing someone else's profile.
  18. dynamically build the sql query with only those fields that you intend to update. since this will also involve dynamically binding the input data, this would be a good time to switch to the much simpler PDO database extension, which will simply let you build and supply an array consisting of the input values that match the prepared query when you call the ->execute([...]) method. note: the account_level and role_id are permission related and shouldn't be included in the profile edit process when the user is editing his own data, but could be included if a moderator/administrator is editing someone else's profile, so these two fields would need to be dynamically handled depending on who the current user is. you may want to only edit them through a moderator/administrator permission edit interface, rather than have them as part of the profile edit interface. if you are doing this for real, you need to test and enforce user permissions to ensure that the current user is authorized to both see and process a profile edit form. if you store validation error messages in an array, using the field/column name as the array index, you can test at any point if there's an error associated with any field/column name, by using isset(). you can test at any point if there are no errors or there are errors by testing if the array is empty or not empty. copying variables to other variables, without a good reason, is a waste of time. a good reason to do this would be if you were trimming the data. you can do this using a single php statement that will trim all the data at once.
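A sketch of dynamically building the UPDATE query with PDO (the field list, table, and column names are invented for illustration):

```php
<?php
// hypothetical list of fields the current user is permitted to update;
// permission-related fields like account_level/role_id would only be added
// to this list for a moderator/administrator
$fields = ['first_name', 'last_name', 'email'];

$set    = [];
$params = [];
foreach ($fields as $field) {
    if (isset($_POST[$field])) {
        $set[]    = "`$field` = ?";     // names come from our list, not the user
        $params[] = trim($_POST[$field]);
    }
}

if ($set) {
    $params[] = $user_id; // the id of the row being updated
    $sql = "UPDATE users SET " . implode(', ', $set) . " WHERE id = ?";
    $pdo->prepare($sql)->execute($params); // one array of values per execute()
}
```

Because the column names come from a hard-coded whitelist and only the values go through place-holders, the dynamically built query stays safe.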
  19. neither do we since we cannot see the problematic code from where we are sitting.
  20. you should x/* out (redact) information in posts, rather than delete it, as deleting it changes the meaning of what you post. site, personal, sensitive, and configuration information used in code should be defined in a separate .php configuration file and required into your main code. this will let you post any of your main code as-is without needing to alter it.
  21. that information was a misstatement. this code is not processing an email. it is processing the result of clicking on a link that was sent in an email. another issue with this code is it ONLY works because you are reading the email on the same computer where the web server is running. the URL being sent in the email is a relative URL. to do this properly, the link would need to be an absolute, fully qualified URL, so that when it is clicked/copied to a browser's address bar, it will work regardless of what computer the email is opened on.
  22. reproducing something you see on the web isn't learning. it's you functioning as a human photo-copier machine, where the quality of the output cannot be any better than the quality of the input, which in this case is filled with mistakes and bad, out-of-date practices. ignoring for the moment that data is almost never actually deleted in real life (it's updated to indicate it is no longer being used), for the current operation of deleting a specific record in a database table, what inputs do you have or need, what processing are you going to do based on those inputs, and what result or output are you going to produce? defining the Inputs, Processing, and Output (IPO) is the most useful thing you can do before writing any code. once you have defined these things, you can go about designing, writing, testing, and debugging the code needed to accomplish the steps it takes to perform this operation.
  23. sorry for all the negativity in this reply. code that doesn't work and doesn't contain any useful comments, doesn't tell us what the expected result is. it would be more helpful if you posted an .sql dump of some sample data and show or describe what result you expect based on that sample data. some comments about the posted code - be consistent in what you name same meaning values. the company_id value is being referred to as - userClient, Engineer_Company, companyID, and Company_ID. don't use addslashes() at all. use a prepared query when supplying external/unknown data to an sql query statement. don't run SELECT queries inside of loops. don't run the same SELECT query multiple times. don't mingle sql specific 'business' logic, that knows how to query for and retrieve data, with the 'presentation' logic. don't copy variables to other variables without any reason. don't write unnecessary html markup. don't pass unnecessary values through forms. don't echo static html.
  24. don't execute SELECT queries inside of loops. with today's server hardware, the time it takes for php to send the sql query statement to the database server is several times longer than the time it takes the database server to execute a query. you want to limit the number of queries you execute. postgreSQL has a LIMIT n OFFSET m statement that you should use for pagination. if for some reason you are not supposed to use that for this assignment, use php's array_slice() on the $listings array to get an array of the desired ids. then use an IN() operator to get the desired rows of data using one query. btw - the header() redirect needs an exit/die statement to stop program execution. without an exit/die, all the rest of that code is still executed. since you haven't shown us what the build_listing_card() code is or what output it produces, we cannot help you with what your view listing links should be or why they don't work. don't write code like this either. every pg_fetch_result() call performs a data-seek, followed by a fetch. this takes twice as long as just fetching the data, and you have a couple of dozen pg_fetch_result() statements. the query in this code will match at most one row of data. just fetch that row into a php variable, then access elements of that fetched array. this will run measurably faster and take a lot less typing to produce the code needed for the page. if your initial long list of echo statements are for debugging purposes, just use print_r() or var_dump(), surrounded by html <pre>...</pre> tags to format the output. if this output is instead a desired part of the page output, don't spend your time typing out line after line of code for each possible field/column. use a loop and let php do the work for you.
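A sketch of the array_slice() + IN() approach ($listings and the table come from the post; $conn, the page size, and the column names are assumptions):

```php
<?php
// page the ids in php, then fetch all the matching rows with ONE query
$per_page = 10;
$page     = max(1, (int)($_GET['page'] ?? 1));
$ids      = array_slice($listings, ($page - 1) * $per_page, $per_page);

if ($ids) {
    // build an IN() list of numbered place-holders: $1, $2, ...
    $ph = implode(', ', array_map(fn($i) => '$' . ($i + 1), array_keys($ids)));

    $result = pg_query_params($conn,
        "SELECT * FROM listings WHERE id IN ($ph)", $ids);

    // fetch each row once into an array instead of repeated pg_fetch_result() calls
    while ($row = pg_fetch_assoc($result)) {
        // produce the output for each $row element here
    }
}
```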