
Psycho (Moderator) - Posts: 12,157 - Days Won: 129

Everything posted by Psycho

  1. I agree with Ch0cu3r, the DB connection or the query is likely failing - but you aren't checking for errors. I'd also suggest separating the "logic" from the "output"; it makes debugging much easier. Try this:

<?php
//Connect to the database
$dbc = mysqli_connect(DB_HOST, DB_USER, DB_PASSWORD, DB_NAME);
if (mysqli_connect_errno()) {
    die("Connect failed: " . mysqli_connect_error());
}

//Grab the city names from the MySQL table
$query = "SELECT cities FROM locations";
$result = mysqli_query($dbc, $query);
if (!$result) {
    die("Query failed: " . mysqli_error($dbc));
}

$cityOptions = '';
while ($row = mysqli_fetch_assoc($result)) {
    $cityOptions .= "<option value='{$row['cities']}'>{$row['cities']}</option>\n";
}

//Close the db connection
mysqli_close($dbc);
?>

Select a city:
<select name="city">
<?php echo $cityOptions; ?>
</select>

If you still get an empty select with no errors, then I'd guess that there are no values for 'cities' in the 'locations' table. But, you could check for that as well.
  2. A few things: 1. Don't use the short opening PHP tags (unless you are using the short form to output a variable). 2. Don't "mix" the PHP and HTML output. Separate the logic from the output; it will make your code much easier to read and much more flexible. 3. Don't put numbers in quotes when you are comparing numerical data: use if($giftCard->rowCount() > 0), not > '0'. 4. Don't run queries in loops! It is a performance killer. To solve this problem, I would suggest putting the query results into a multi-dimensional array, which makes it easy to know how many rows belong to each parent item when you create the output.
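To illustrate point 4, here is a minimal sketch of the multi-dimensional array approach. The table/column names and the sample result set are hypothetical - in real code the $rows array would come from a single JOIN query like the one in the comment:

```php
<?php
// One JOIN query replaces running a child query inside a loop, e.g.:
//   SELECT c.category_name, g.card_name
//   FROM categories c
//   JOIN gift_cards g ON g.category_id = c.category_id
//   ORDER BY c.category_name
// $rows = $db->query($sql)->fetchAll(PDO::FETCH_ASSOC);

// Sample result set standing in for fetchAll()
$rows = array(
    array('category_name' => 'Retail', 'card_name' => 'Card A'),
    array('category_name' => 'Retail', 'card_name' => 'Card B'),
    array('category_name' => 'Dining', 'card_name' => 'Card C'),
);

// Group child rows under their parent in one pass
$grouped = array();
foreach ($rows as $row) {
    $grouped[$row['category_name']][] = $row['card_name'];
}

// Now the row count per parent is just count() - no queries in the loop
foreach ($grouped as $category => $cards) {
    echo $category . ' (' . count($cards) . ")\n";
    foreach ($cards as $card) {
        echo "  - {$card}\n";
    }
}
```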
  3. Here's my parting thought. I think you are making assumptions with no data to back them up. If you understood all the benefits/drawbacks of using a cache file vs. caching in memory, I'm sure you would already know the answer to your question. Typically, you would only consider storing data in memory if you have already determined that there is a performance problem. And, yes, PHP does have a way to do that. You're a smart guy. I'm sure you'll figure it out.
  4. Yeah, I'm not even going to touch that. I already stated it will be a LOT of work and, as ginerjm just stated, any changes in the layout of a site would break the code. If you want to do this, you need to learn to do it yourself, because you are going to need to support it as those changes occur. I already suggested that you do this for one site first. Once you've done that, you will have an idea of the full scope of work. Here are the basic steps you would want to accomplish: 1. Determine how a search is done on the site: POST or GET. If it is GET, doing a search is easy - just append the search string to the URL using the appropriate name/value pairs. If it is POST, you can use cURL or (with PHP5) you can use something such as shown on this thread: http://stackoverflow.com/questions/5647461/how-do-i-send-a-post-request-with-php 2. Next, do some searches through your browser on the site and inspect the HTML source code of the results. Make sure you do a LOT of searches to try and detect any variability in how the results page and its content are structured: no results, one result, multiple results. You say there would only ever be one result. Go ahead and use that assumption, but I would not. What if a search did not find a match, but the site instead returned something that was "close"? 3. Once you have analyzed all the variations in the output of the search results, build the logic for that site to extract the data you want, i.e. the price. If you get that far for one site, you will then understand how much work would be involved for multiple sites. Of course, that doesn't cover the process of taking the results from multiple sites and putting them into a combined output.
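For step 1, a POST search with cURL might look like the sketch below. The URL and form field names are placeholders - you would get the real ones by inspecting the target site's search form:

```php
<?php
// Submit a POST request and return the response body.
// Placeholder URL/field names throughout.
function postSearch($url, array $fields)
{
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_POST, true);
    curl_setopt($ch, CURLOPT_POSTFIELDS, http_build_query($fields));
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // return instead of echo
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true); // follow redirects
    $html = curl_exec($ch);
    if ($html === false) {
        $error = curl_error($ch);
        curl_close($ch);
        die("Request failed: " . $error);
    }
    curl_close($ch);
    return $html;
}

// Usage (hypothetical search form with a field named 'q'):
// $html = postSearch('http://example.com/search.php', array('q' => 'widget'));
```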
  5. FYI: Keep in mind that a "cache" is not some magical entity - it is simply a file. A cookie and session data can be considered cache files, although there is specific functionality around retention and the ability to set and get values. What has been proposed above about creating a file for each language is definitely the way to go and is, in effect, a cache file. Reading a language file is very fast and this pattern is used a lot. Plus, it is simple to allow a user to "change" their language and load a different file. The language selection should be saved as a cookie value whenever they make a selection. On a side note, I know it is possible with some technologies to actually store such data in memory to be used from request to request, even for different users. Maybe this is what you were really after. I don't have any experience with this in PHP, so I can't comment on that. But, I've used language files many times.
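A minimal sketch of that pattern - the lang/ folder layout, file names, and keys are my assumptions, not anything from the thread. Each language file simply returns an array:

```php
<?php
// A language file (e.g. lang/en.php) would contain something like:
//   <?php return array('greeting' => 'Hello', 'farewell' => 'Goodbye');

// Whitelist the selection - never include a file based on raw user input
function pickLanguage($cookieValue, array $allowed, $default = 'en')
{
    return in_array($cookieValue, $allowed, true) ? $cookieValue : $default;
}

$lang = pickLanguage(isset($_COOKIE['lang']) ? $_COOKIE['lang'] : null,
                     array('en', 'fr', 'de'));

// Load the language "cache" file and use it for all output:
// $text = include "lang/{$lang}.php";
// echo $text['greeting'];

// When the user changes languages, remember the choice in a cookie:
// setcookie('lang', $lang, time() + 60 * 60 * 24 * 365, '/');
```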
  6. Start by analyzing the HTML content of the search results and start building the code to extract the content you want from the page. But, trying to do this without a DB AND creating a combined output in CSV will be problematic. Performing a request to many sites for many products will take a long time. So, either you would have to run the script from the command line (so it won't time out) or you would have to set it up to refresh after each request. And, doing the latter would complicate creating the CSV file, since you would have to open it, write to it, then close it on each execution of the script. So, start by creating the process for a single site to perform a search, get the results and parse the results. Then you can move to the process of creating a framework to run the code for each site and output the results. I would suggest using simple_html_dom.
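For the parsing step, here is a rough sketch using PHP's built-in DOMDocument/DOMXPath (simple_html_dom offers a friendlier find() syntax, but the idea is the same). The span element and 'price' class here are hypothetical - you would adapt the XPath to whatever the real results page uses:

```php
<?php
// Pull all "price" values out of a search-results page
function extractPrices($html)
{
    $doc = new DOMDocument();
    @$doc->loadHTML($html); // @ suppresses warnings from sloppy real-world markup
    $xpath = new DOMXPath($doc);
    $prices = array();
    foreach ($xpath->query("//span[@class='price']") as $node) {
        $prices[] = trim($node->textContent);
    }
    return $prices;
}

// Hypothetical snippet of a results page
$sample = '<html><body><span class="price">$19.99</span></body></html>';
$found = extractPrices($sample);
```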
  7. What do you mean it can't happen? How do you know? What if there are 26 matches to your search and the search results are displayed in a paginated fashion with 25 results per page? Unless you have already done that analysis and know that no search will pull many records, you are just making assumptions. As for the "out of stock" scenario, you are again assuming that the price would be displayed or, if not displayed, that the layout of the HTML would be the same so you can 'check' the price field. But, what if the HTML layout for an out-of-stock product does not match in-stock products? The logic to analyze the page could fail. Like I said, this is a very laborious, tedious process because you have to go through a lot of trial and error to produce all the possible outcomes and then analyze the HTML source to determine how to build the logic. And, as stated above, if any site makes any changes your script will likely break. Why is a DB out of the question? It would definitely be easier. Plus, you could use it to "run" the scripts to determine which products to search rather than entering them manually.
  8. Yeah, none of this is extremely difficult - but it will be a LOT of work. I have built a screen-scraper previously (i.e. a script that reads a remote web page and extracts certain data). It is a very laborious, tedious process. You will need to manually run searches and inspect the HTML results for each site to identify how the results pages are built. Then you need to build code for each site in order to extract the information you want. And, the output for a given site will not always be consistent, so you need to go through a lot of repetition to determine what differences there may be and account for them. For example, you need to account for what the output looks like when there are no matches vs. actual matches. What if there are a lot of matches spread across separate pages? You may need to build logic to traverse those pages to get all the data. Or, in some cases the actual HTML format could differ based on the search, such as if some products have images and others do not. Depending on how the HTML is built, the parsing may need to be different. Or, what if the product is out of stock? Then, after you go and do all of that for one single site - BAM!, they change their layout and all your work is down the drain. Plus, you state you want to do this for possibly hundreds of products? That's crazy. That would take a long time and could even get identified as malicious activity. You should definitely build a database and update it with all the products from each site on a regular basis. Then run the searches against your database. I would suggest reaching out to these sites to see if they have a service to get their current product list and pricing rather than building a screen scraper.
  9. PHPFreaks uses a complete package for the forum called IP Board, which I believe some of the members have made modifications to. It costs a minimum of $175 plus regular renewal fees. If you are wanting to implement a login system in some PHP site/application you have built, it isn't just a matter of downloading and installing a script. There are a LOT of questions you need to answer before you just go and download something. What features do you need? Are some users going to have different rights? What is the process for setting up an account, and for changing or resetting passwords? If you know those answers, you could then do some searching to see if there is something that meets your needs. Then, it would be more than just downloading the package; it would require some implementation throughout your application. But, be careful to research any possible solution, as it would be easy for me to build such a package and post it online for anyone to use - along with a backdoor or flaw that I know of!
  10. Early days? The standard (per RFC 2109) has been at least 4K per cookie for almost two decades, which should be more than enough for simple preferences. And, yes, it does make sense to create a single cookie holding an array rather than having to manage individual cookies. For one, as all of us have been saying, you would just set the cookie (if it doesn't exist) on any page load. Then, when the user accesses (or attempts to access) the preferences page, you simply check if the cookie exists. Heck, you could even hide the link to the preferences page if the cookie doesn't exist. Otherwise you don't know "which" cookie to look for, requiring you to create a "test" cookie. That only creates a dependency that isn't needed and adds complexity. Plus, with a single cookie you only have to write the code to create/update the cookie once. With individual cookies, each time you add/edit a preference it takes more time and code, since you also have to handle the writing/reading of those new cookies. Each cookie should have a specific purpose. And, lastly, there is a limit on the maximum number of cookies you can have per domain, and you would be more likely to hit that limit than the size limit of a single cookie. Per RFC 2109 a browser only has to support 20 cookies per domain. http://support.microsoft.com/kb/306070/en-us
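A sketch of the single-cookie approach. Cookies only store strings, so the preferences array has to be encoded - json_encode is used here, and the cookie name and preference keys are just examples:

```php
<?php
// All preferences live in ONE cookie, encoded as JSON
$defaults = array('theme' => 'light', 'results_per_page' => 25);

if (!isset($_COOKIE['prefs'])) {
    // First page load: create the cookie (must run before any output)
    setcookie('prefs', json_encode($defaults), time() + 60 * 60 * 24 * 365, '/');
    $prefs = $defaults;
} else {
    $prefs = json_decode($_COOKIE['prefs'], true);
    if (!is_array($prefs)) {
        $prefs = $defaults; // corrupt or tampered cookie
    }
}

// Later, checking whether the user accepts cookies is one isset() call:
// if (!isset($_COOKIE['prefs'])) { /* warn: settings won't persist */ }
```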
  11. Most cookie checks I've seen rely on JavaScript so that the user can be alerted on the very first page load. But, that requires that JS be enabled as well. With PHP alone, you can only check a cookie on a subsequent page load. In your situation, I would simply set the cookie on the first page load and then wait until the user goes to change a setting to alert them that they are not accepting your cookies and that their settings will not persist past the current session. Also, you don't need a "test" cookie. Have one cookie as an array with all the settings. So, on first page load, just set the initial cookie as an empty array (if it doesn't already exist).
  12. (In the "multi query" thread) Hey, it's your project. I'm not saying you can't do it how you think it should be done. Go for it. I'm only trying to provide guidance based upon experience and knowledge. There are some basic principles with respect to best practices regarding various things such as database normalization, data abstraction, etc. Believe me, most of us providing help on this forum have made the same mistakes as everyone else. Sometimes people take our advice and sometimes they don't. The fact that you feel so strongly in your position yet do not have the experience or knowledge to know why it is better or worse is telling. There are many reasons why that approach is problematic which I won't waste my time going into since you are not receptive to it. But, just to highlight for others that might see this post, here is one. A delete operation is much more expensive (i.e. a performance drag) because indexes have to be updated. Using a "soft delete" flag in tables is a tried and true methodology.
  13. I would disagree, in that it is hard-coded explicitly for two columns. The WriteResult() function was a nice addition, though. Even with ginerjm's approach, the process could be expanded so that it is not hard-coded for specifically two columns. But, this has turned into an argument about the 'optimal' approach and, with it, come lots of opinions. The bottom line is whether the requirements are met or not. Several methods of meeting those requirements have been provided.
  14. (In the "multi query" thread) So, when a record is deleted, you really don't want it deleted, since you want "admin" users to be able to view it. You do NOT want two tables. Instead, create a new field in the table called "deleted" with a default value of 0. When a user deletes a record, change the value of that field to 1 (i.e. TRUE). Make sure that any queries to display records to users exclude records that are "deleted". Then make your queries for the admins, which will include the "deleted" records. Copying data from one table to another only creates more work with no benefit.
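The soft-delete pattern described above, sketched with PDO. An in-memory SQLite database is used here only so the snippet is self-contained; the table and column names are hypothetical:

```php
<?php
// Self-contained demo of a "deleted" flag instead of a second table
$db = new PDO('sqlite::memory:');
$db->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
$db->exec("CREATE TABLE records (
    id INTEGER PRIMARY KEY,
    title TEXT,
    deleted INTEGER NOT NULL DEFAULT 0
)");
$db->exec("INSERT INTO records (title) VALUES ('keep me'), ('delete me')");

// "Deleting" a record just flips the flag - no data is copied or removed
$stmt = $db->prepare("UPDATE records SET deleted = 1 WHERE title = ?");
$stmt->execute(array('delete me'));

// Queries for normal users exclude flagged rows
$visible = $db->query("SELECT title FROM records WHERE deleted = 0")
              ->fetchAll(PDO::FETCH_COLUMN);

// Queries for admins include them, with the flag available for display
$all = $db->query("SELECT title, deleted FROM records")
          ->fetchAll(PDO::FETCH_ASSOC);
```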
  15. Although the OP asked for a two-column solution, I would propose a solution that allows the column count to be defined separately. I would not hard-code it to only support two columns. To the OP: - Don't use field names with spaces in them. Yes, it can work, but it will only end up costing you lost time when the inevitable problems arise. - Don't use '*' in your SELECT statement if you really don't need ALL the fields. - No need to make new variables from the $row array just to use them one time. The following allows you to define the number of columns at the beginning of the script. So, if you later decide you want three columns or four or whatever, you only need to change the variable at the beginning of the script and it will "just work". Note, I don't have your database, so I didn't test it. There may be a small syntax error or two, but the logic is sound.

<?php
// Set column count
$max_columns = 2;

// Check connection
if (mysqli_connect_errno()) {
    die("Failed to connect to MySQL: " . mysqli_connect_error());
}

// Select data
$query = "SELECT `Name`, `Author`, `Link to Cover`, `Link to Profile` FROM Recommendations";
$result = mysqli_query($con, $query) or die(mysqli_error($con));

// Create content
$record_count = 0;
$output = '';
while ($row = mysqli_fetch_array($result)) {
    $record_count++;
    //Open new row if needed
    if ($record_count % $max_columns == 1) {
        $output .= "<tr>\n";
    }
    //Create output for current record
    $output .= "<td>";
    $output .= "<a href='{$row['Link to Profile']}'>{$row['Name']}</a><br />\n";
    $output .= "{$row['Author']}<br />\n";
    $output .= "<a href='{$row['Link to Profile']}'><img src='{$row['Link to Cover']}' /></a>\n";
    $output .= "</td>\n";
    //Close row if needed
    if ($record_count % $max_columns == 0) {
        $output .= "</tr>\n";
    }
}
//Close last row if needed
if ($record_count % $max_columns != 0) {
    $output .= "</tr>\n";
}
?>
<table>
<?php echo $output; ?>
</table>
  16. Yes, I know. I was only pointing out that using a sub-query to get the records and then using an outer query to do the GROUP BY is unnecessary - the GROUP BY can go directly in the inner query.
  17. A simpler solution is to modify the query to return a boolean value for each record indicating whether it is earlier or later than the current time, rather than converting and comparing each date in PHP (not the speediest process). Plus, you should not replicate code. The two outputs are nearly identical save for the bold format. So, create one code block for the output with a variable to make it bold or not.

// Make a MySQL connection
$query = "SELECT name, age, (clock < NOW()) as past FROM staff";
$result = mysql_query($query) or die(mysql_error());
while ($row = mysql_fetch_assoc($result)) {
    $fontWeight = ($row['past']) ? 'bold' : 'normal';
    echo "<span style='font-weight:{$fontWeight};'>{$row['name']} - {$row['age']}</span><br />\n";
}
  18. OK guys, no need to be so harsh. We were all beginners at one point. 0xfo7d, there are a few specific problems: 1) You are not using JOINs to relate your tables in the queries. Being able to JOIN tables in your queries is the real power that a relational database gives you. It can be hard to grasp at first; take a little time and look at a tutorial or two. Never, ever run queries in loops unless you have no other alternative - and there should be very few such scenarios. 2) You are using '*' in all the queries even though you don't need all the data. 3) You are querying ALL the records only to get the count. I'll look at the queries a little more to see if I can provide some revised ones to get just the data you need with only one or a few queries (with no loops). But, it's difficult without understanding the schema. EDIT: I'll add a #4:

SELECT * FROM (SELECT * FROM `rishum` WHERE `status`!='not_relevant' AND `city`='".$row1["id"]."' AND `rishum_to`='1' AND `aougust_form`='".$_GET["aougust_form"]."' AND `date`>='".$filterByYear."/04/22' AND `date`<='".($filterByYear+1)."/04/21') AS c GROUP BY `talmid_id`

This really makes no sense. Why would you run a sub-query only so you can select all the records and do a GROUP BY? You can simply do the GROUP BY in the inner query and not need the sub-query:

SELECT * FROM `rishum` WHERE `status`!='not_relevant' AND `city`='".$row1["id"]."' AND `rishum_to`='1' AND `aougust_form`='".$_GET["aougust_form"]."' AND `date`>='".$filterByYear."/04/22' AND `date`<='".($filterByYear+1)."/04/21' GROUP BY `talmid_id`

(Also, concatenating $_GET values directly into a query string leaves you open to SQL injection - use prepared statements or at least escape the input.)
  19. You need to educate your client that what he/she is asking for provides no real value. You *could* do some things that make it moderately more difficult to find the actual URL. But, anyone with any knowledge of how this stuff works can easily find that URL. You can't provide something to a user (text, images, video, etc.) and protect it at the same time. It doesn't take advanced hacking skills. Heck, when I want cover art for my MP3 albums I run into sites all the time that try to do stupid things to prevent the images from being copied. [Note: this is for my personal use and not commercial, so it is protected under Fair Use laws.] For example, they may disable the right-click menu. Well, that can be defeated by turning off JavaScript, and it also takes away the ability to use other useful functions such as printing. Or, they may put a transparent image over the actual image. Again, easily worked around. Any image you are viewing in your browser has already been downloaded to your machine. All you have to do is go to the cache and find it. But, in those cases I just take the even easier low-tech solution of taking a screenshot on my computer and cropping the image out. You cannot protect the content you are providing the user. Period. End. Of. Story. So, you could make it only moderately more difficult for the user to find the real URL. Just point the URL to one of your pages and have that page do a redirect to the actual URL. So, in the page you provide the user, use a URL such as http://mysite.com/video.php?id=5 Then, create the page video.php with code to determine the actual URL to redirect to:

$videoID = $_GET['id'];
if ($videoID == 5) {
    header("Location: http://real_url_to_the_page");
}

But, this is all pointless, because it is very simple to "see" what URLs the browser is loading. Browsers have ways to inspect the requests that are being made, plus there are add-ons and 3rd-party tools that make it child's play to find that URL. Because, when the redirect happens, the user's browser still has to make a call to that URL.
  20. Agreed. Some other comments/considerations: Manipulating the user input without their knowledge is generally a bad idea, because the result may be something very different than they intended. I've seen situations where code would parse out characters which someone felt did not belong in an input - e.g. removing special characters from a person's name. It's easy to forget about dashes (hyphenated names), apostrophes, or even accented characters. Even if you managed to keep all those, I have no clue if there are other characters that may be used in other dialects. As stated above, escaping before storing is a bad approach. You may only be planning to use the value in HTML output now, but what if you need it for something else in the future? For example, let's say you need a plain-text output, or to send the data in JSON format for a service? You would not want all the characters translated into their HTML entities. And, even though there is a function to reverse HTML entities, you cannot guarantee you will get back the same original content. Making the conversion when you store the data seems to have the advantage of a one-and-done approach - i.e. you don't have to worry about escaping correctly when writing any output code. But, that is a lazy approach that will come back to haunt you at some time.
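The "store raw, escape at output" principle in a nutshell - the sample value is made up, but the functions are standard PHP:

```php
<?php
// Store exactly what the user typed; pick the escaping at output time
$raw = "O'Brien <script>"; // as stored in the database

// HTML page: escape for HTML at the moment of output
$forHtml = htmlspecialchars($raw, ENT_QUOTES, 'UTF-8');

// JSON service: no HTML entities wanted - json_encode applies its own rules
$forJson = json_encode(array('name' => $raw));

// Plain-text output (email, CSV): the raw value is already correct
$forText = $raw;
```

Had the value been run through htmlspecialchars() before storage, the JSON and plain-text outputs would both carry unwanted entities.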
  21. What are you trying to achieve? Are you trying to prevent people from opening the video directly? You can't. Yes, there are things you could do to make it a little more difficult, but in the end it is impossible, so why go to the trouble? People are always asking how they can "protect" their content from being copied. If you are making content publicly available on the internet - it is publicly available.
  22. FYI: You should have logs for your webserver that contain records of every page request.
  23. That is just a parameter passed through the URL. It does not "encrypt" anything. You can pass any number of name/value pairs on the URL. It is just data and has no direct functional application; it is what you DO with those values that matters. I have no clue what Google is doing with that "encrypt" value. It could be that the video file is encrypted on their servers and that value is the key to decrypt it so the processing page can read the file and send it to the user. Or, it may be a key used to track users watching the video. There may be some public information on what Google uses that value for, but I'm not going to do the research. If you do want to encrypt something, it really has nothing to do with passing an "encrypt" value on the query string.
  24. Your database should have a different username/password than your credentials for accessing your web host interface. Are you SURE those are the credentials for accessing phpMyAdmin? For my host, once I access the administration panel for my databases, there are buttons to launch phpMyAdmin. When I click those buttons, it launches phpMyAdmin automatically without an additional login. That's not because its username/password is the same as the one I used to get to the admin panel; it is because the credentials are being sent behind the scenes. Check the administration area for your databases and you should see information about the user or users for the database. The password may not be something you can "see", but you should be able to "edit" the user and change the password.
  25. @Bidyhll, A couple quick critiques: 1) You will do yourself a great favor by giving your variables/fields meaningful names. Using 'val1', 'val2', 'val3', etc. may make sense to you now as you are writing the code. But, if you have to come back to the code after a week or more, you will waste time trying to figure out what they mean. 2) When iterating through an array, use a foreach() loop instead of a for() loop where you manipulate a variable to equal the index of each item in the array. In response to mac_gyver's first post here, your original code should have looked more like this:

//Start transaction
$db->beginTransaction();
//Create prepared statement
$stmt = $db->prepare("INSERT INTO scripts (val2, val3, val4) VALUES (:val2, :val3, :val4)");
//Bind the parameters outside the loop using unique variable names
$stmt->bindParam(":val2", $val2);
$stmt->bindParam(":val3", $val3);
$stmt->bindParam(":val4", $val4);
foreach ($fullarray as $record) {
    //Set the variables used in the bind statements
    $val2 = $record[0];
    $val3 = $record[1];
    $val4 = $record[2];
    //Run prepared statement for current record
    $stmt->execute();
}
//Commit the changes
$db->commit();

Also, it is possible to create a prepared statement for a variable number of insert records. That way you don't have to manually escape values based upon the variable type. Example:

//Start transaction
$db->beginTransaction();
//Create query string for prepared statement for variable record count
$PLACEHOLDERS = array_fill(0, count($fullarray), "(?, ?, ?)");
$query = "INSERT INTO scripts (val2, val3, val4) VALUES " . implode(", ", $PLACEHOLDERS);
//Create the prepared statement
$stmt = $db->prepare($query);
//Put values into a single-dimensional array
$VALUES = array();
foreach ($fullarray as $value) {
    $VALUES = array_merge($VALUES, $value);
}
//Execute the prepared statement for ALL the values
$stmt->execute($VALUES);
//Commit the changes
$db->commit();
