Everything posted by roopurt18

  1. You are giving the wrong impression that shorter code is faster, which may or may not be true depending on the situation. (EDIT) netfrugal, before you go hunting through your code trying to optimize it, try dumping the actual query being run to the browser.  Copy and paste it into phpMyAdmin and see how long the query takes to run.  Then add the LIMIT x, y to the end of it and do the same thing.
  2. As long as you insert the questions in the order you want them to appear, you can use the auto incrementing ID column of the questions table to order them upon retrieval.  Not saying this is the best approach though!
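That approach sketched as a query, assuming the questions table has an auto-incrementing id column and an article_id column linking each question to its article (the names here are illustrative, not from the original thread):

```sql
-- Rows come back in insertion order because the auto-increment id
-- only ever grows as new questions are added
SELECT id, question_text
FROM questions
WHERE article_id = 42
ORDER BY id ASC;
```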
  3. The number of text files you end up with is really of no concern as long as there is a file-management system.  The first option is the better one, in my opinion. What will happen when the site has lots of articles and the single text file grows to be very large?  You will have a lot of text parsing to do to find the specific article you want, which does not make for a fast website. What I really have to wonder is: why aren't you using MySQL?
  4. [quote author=businessman332211 link=topic=110266.msg446371#msg446371 date=1159912937] SELECT datetime, ident, ip, what, url FROM log WHERE ident = '$username' and datetime between '$begindate 00:00:00' and '$enddate 23:59:59' Why do you have the dates there 2 times It has the date, then a number it's better to store all that in a variable and pass it, it's quicker. [/quote]

There is no practical advantage to typing:

[code]
<?php
$sql = "SELECT datetime, ident, ip, what, url
        FROM log
        WHERE ident = '$username'
          AND datetime BETWEEN '$begindate 00:00:00' AND '$enddate 23:59:59'";
?>
[/code]

or typing:

[code]
<?php
$begindate = $begindate . " 00:00:00";
$enddate   = $enddate . " 23:59:59";
$sql = "SELECT datetime, ident, ip, what, url
        FROM log
        WHERE ident = '$username'
          AND datetime BETWEEN '$begindate' AND '$enddate'";
?>
[/code]

[quote author=businessman332211 link=topic=110266.msg446371#msg446371 date=1159912937] What you are trying to do is called script optimization Optimize your script, remove useless code, double code, code that redoes it self If you have 150 lines of code, chances are if you try, it could be reduced to 100 lines of code post all of your code, on the slow page. Let me see if I can optimize it for you some. [/quote]

Reducing your code from 150 lines to 100 also isn't likely to provide any significant performance gains unless you are dramatically overhauling your current [i]algorithm[/i] for handling the data.
  5. I think it's a sad affair that most institutions are moving to Java as the starting CS language.  Not that there is anything wrong with Java, but you can't really learn to appreciate what some of these more modern languages do for you without having had to do it yourself in C and assembler. Just my $.02.
  6. If you find that you'll go the route of precompiling your reports, I'd probably set up a cron job to run every morning at like 2 or 3AM.  The purpose of this cron is to update all reports that actually need updating.  Not necessarily every report is going to need to be updated every time vendor info changes, so when the data does change, flag which reports need updating. How does the vendor data get updated?  Do the vendors import it?  Do you import it?
  7. Instead of constantly echoing output from PHP, try appending it to a single $Page or $HTML variable.  At the very end of your script, echo that single variable and you should be set.
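A minimal sketch of that pattern (the variable and markup here are illustrative):

```php
<?php
// Build the whole page in one string instead of echoing piece by piece
$Page  = "<html><body>";
$Page .= "<h1>Report</h1>";
foreach (array("a", "b", "c") as $row) {
    $Page .= "<p>" . $row . "</p>";
}
$Page .= "</body></html>";

// A single echo at the very end sends everything at once
echo $Page;
```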
  8. The current set of data that you have, how often is it going to be changed?
  9. Do your tables have indexes used by this particular query? (EDIT)  Can we see the query and the table definition?
  10. Just an analogy for the OP. If 20 people throw their drivers license into a basket and I start picking licenses at random, I can easily give the correct license back to the proper person.  The reason I can do so is because of the photo.  Think of the license and the person as representing separate but related data entities, the photo is what connects them together. A DB should be designed in a similar fashion.  You have related pieces of data linked together in some manner; but you don't store them together in the same table because an article doesn't necessarily have questions.  You can put all your articles in one table, all of your questions in another, and as long as each article has a unique integer ID and each question stores the ID of the article it's for, you can link them together. (EDIT) Typos > Me
  11. [quote author=Daniel0 link=topic=109742.msg445801#msg445801 date=1159854438] [quote author=roopurt18 link=topic=109742.msg445495#msg445495 date=1159815079] You're not seriously going to stick 100 extra columns into your [b]test_questions[/b] table, are you? You're also not going to stick 500 extra columns into your [b]test_options[/b] table? [/quote] Do you mean me? Because there is only 4 fields in my options table. And for each option there is inserted another row. [/quote] Nope, sorry for the scare.  I was looking at the OP's post further down the first page and hadn't yet read any of the replies.
  12. I'm still learning how to design a good DB myself.  I keep hearing that MySQL is designed to handle lots of data and to be fast, but most of the time people bring up numbers like "a few hundred thousand."  I'm really not sure what kind of performance you're going to see if you have millions of records.

You say the customer table has 3 million records; is that across all vendors or per vendor? If it's the average per vendor and you have 9k vendors, you might want to give each vendor a separate DB, all of which have the same tables.  Otherwise you'd be looking at billions of records. But seeing as how you want to run queries against all vendors, your life will probably be simpler if everything is in the same DB.

Here's my recommendation: put everything into a single DB, leave indexes off your tables (with the exception of the primary key), and write your site.  Set your site up with a debugging mode that will dump all of your SQL queries as sent to MySQL.  For any sluggish portion of the site that you then come across, take note of the queries and the columns searched against (i.e. look at the WHERE clauses).  Take note of the most common fields (product name, vendor name, etc.) and go back and use ALTER TABLE statements to add indexes on those columns.  You should see some performance gain that way.

As for the particular reports you want to display, such as vendor rating, I'd try to do them in real time first and see what kind of performance you get.  If the performance is tolerable, leave it.  If there's just too much data for a particular report to be done on the fly, consider setting up a cron job to run daily or weekly to precompile the results into a table; then you can just SELECT from that table.

I've only been working with PHP and MySQL for a little more than a year and only 6 months professionally; if I'm totally off-base hopefully someone with more experience will chime in.
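For the indexing step, the ALTER TABLE syntax looks like this (the table and column names here are made up for illustration):

```sql
-- Add an index on a column that shows up in WHERE clauses a lot
ALTER TABLE customers ADD INDEX idx_lname (lname);

-- A composite index for queries that filter on vendor and product together
ALTER TABLE purchases ADD INDEX idx_vendor_prod (vendor_id, prod_id);
```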
  13. Also, I loaded the page and had to hit stop.  I didn't find any javascript causing the redirect in the source I received, but that doesn't mean it doesn't exist somewhere at the very bottom of the page.
  14. Open the source in each of your directories a few files at a time and use your editor's "Search in Files" feature for the redirected URL.  If it doesn't turn up, look for calls to the header() function. Can we rule out .htaccess redirection because the initial page loads?  I don't know enough about web servers to make that call.
  15. cust_id | fname | lname | ... >> [b]customer table[/b]
vendor_id | vname | ... >> [b]vendor table[/b]
state_id | name | abbreviation | ... >> [b]state table[/b]
prod_id | name | cat_id | ... >> [b]product table[/b]
cat_id | name | ... >> [b]product category table[/b]

The ... represents any extra information required for each row in that table, such as an address for the customer table.  Given those five tables we can start relating your data.

rel_id | vendor_id | cust_id >> [b]vendor / customer relationship table[/b]
Each row in the table is a unique combination of vendor_id and cust_id.

rel_id | vendor_id | state_id >> [b]vendor / state relationship table[/b]
Each row in the table is a unique combination of which state a particular vendor is active in.

rel_id | vendor_id | prod_id >> [b]vendor / product relationship table[/b]
Each row in the table is a unique combination of a product carried by a vendor.

purchase_id | vendor_id | cust_id | prod_id | date | cost | ... >> [b]purchase history table[/b]
This table will allow you to generate a purchase history of all customers across all vendors for all products.

That's just one possible DB setup and possible relating tables.  Remember to keep in mind when creating your DB how the data relates to itself.  Define clearly what a product is, what a customer is, what a vendor is, etc.  The number of tables required will depend on the types of relationships the data has.  Is the relationship between a product and vendor one to one (only that vendor carries that product) or one to many (several vendors carry that product)?  Does each vendor operate in only one state (one to one) or can they operate in several states (one to many)?
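A few of those tables sketched as CREATE TABLE statements (the column types and sizes are assumptions; adjust them to your data):

```sql
CREATE TABLE vendor (
  vendor_id INT UNSIGNED NOT NULL AUTO_INCREMENT PRIMARY KEY,
  vname     VARCHAR(100) NOT NULL
);

CREATE TABLE customer (
  cust_id INT UNSIGNED NOT NULL AUTO_INCREMENT PRIMARY KEY,
  fname   VARCHAR(50) NOT NULL,
  lname   VARCHAR(50) NOT NULL
);

-- One row per unique vendor / customer pairing
CREATE TABLE vendor_customer (
  rel_id    INT UNSIGNED NOT NULL AUTO_INCREMENT PRIMARY KEY,
  vendor_id INT UNSIGNED NOT NULL,
  cust_id   INT UNSIGNED NOT NULL,
  UNIQUE KEY uniq_pair (vendor_id, cust_id)
);
```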
  16. You're not seriously going to stick 100 extra columns into your [b]test_questions[/b] table, are you? You're also not going to stick 500 extra columns into your [b]test_options[/b] table? All of those extra columns are declared as VARCHAR(250).  If every single test were going to have 100 questions with 5 choices each, that might be fine, though even then you'd have ridiculously long queries; but you've already stated that they won't.  So what you're really going to do is waste [b]a ton[/b] of space.

1) Start with an articles table and give each entry a unique ID, called [b]article_id[/b].

2) Create a questions table and give each question a unique ID.  In this table you will need a column to link each question to an article; this column will be titled [b]article_id[/b].

Given this setup, you can query the database for all of the questions related to a particular article:

"SELECT * FROM articles a, questions q WHERE [b]q.article_id = a.article_id[/b]"

Notice the part in bold; it is that unique ID that allows you to link the questions to the article.

3) Now create your answers table, giving each answer a unique ID, answer_id.  Each answer will also have a column linking it to a question, question_id.

Now you can select all of the answers for a given question:

"SELECT * FROM questions q, answers a WHERE [b]q.question_id = a.question_id[/b]"

If you need to select all answers for an article:

"SELECT * FROM questions q, answers an, articles ar WHERE ar.article_id = q.article_id AND q.question_id = an.question_id"
  17. Unless I'm totally blind, when viewing a thread the bread crumbs are only available at the top of the page.  Being a lazy individual, I'd love it if I didn't have to scroll up to find them.  ;D
  18. Pagination will help a lot.  Basically you want to append LIMIT <offset>, <count> to your SQL statement. <offset> is an integer telling MySQL how far into the result set to start returning rows. <count> is the number of records to actually return.

Let's say you want to display 25 records per page:

Page 1: LIMIT 0, 25
Page 2: LIMIT 25, 25
Page 3: LIMIT 50, 25
Page N: LIMIT 25 * (N - 1), 25

If you want to display Z records per page, the general formula is:

Page N: LIMIT Z * (N - 1), Z

It's been a while since I did pagination, but I believe this is all correct.
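The Page N formula translated into PHP as a sketch; $page and $perPage stand in for whatever your paging links actually pass:

```php
<?php
$perPage = 25;               // Z: records per page
$page    = 3;                // N: current page, 1-based

// Page N: LIMIT Z * (N - 1), Z
$offset = $perPage * ($page - 1);
$sql = "SELECT * FROM log LIMIT $offset, $perPage";

echo $sql;   // SELECT * FROM log LIMIT 50, 25
```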
  19. I have a cron job that fired off sometime yesterday; it's still running and I need to shut it down.  I don't have access to ps so I have no idea what PID it has.
  20. There are a couple of methods off the top of my head. 1)  You can use substr to break the date down and reassemble it. 2)  You can use a regexp with grouping and preg_match to fill an array with the date parts, then reassemble it. Or maybe there is a built-in PHP function that does this, although I'm not aware of its existence.
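Both methods sketched, assuming the input is YYYY-MM-DD and the target is MM/DD/YYYY (the actual formats weren't given, so these are placeholders):

```php
<?php
$date = "2006-10-03";

// 1) substr: slice out the parts and reassemble them
$reassembled = substr($date, 5, 2) . "/" . substr($date, 8, 2) . "/" . substr($date, 0, 4);

// 2) preg_match with grouping: capture the parts into an array
if (preg_match('/^(\d{4})-(\d{2})-(\d{2})$/', $date, $m)) {
    $reassembled2 = "$m[2]/$m[3]/$m[1]";
}

echo $reassembled;   // 10/03/2006
```

For what it's worth, PHP does ship with date() and strtotime(), so date("m/d/Y", strtotime($date)) also works for formats strtotime can parse.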
  21. PHP and MySQL are what drive your website.  They're installed on the web host's computer, which can be running any server OS.  As long as the OS provides automated task scheduling, you can schedule your site to do anything. Try going to http://www.google.com and searching for: crontab cron jobs. Look for the pages pointing back at Wikipedia as a starting reference.
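On a Unix-like host, a crontab entry looks like this (the script path is made up for illustration):

```shell
# min hour day month weekday  command
# Run a PHP maintenance script every morning at 3:00 AM
0 3 * * * /usr/bin/php /home/user/site/cron/update_reports.php
```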
  22. Your client is probably trying to make the process as user friendly as possible, which is good.  You could try posting a simple walk-through on the site to help novice users with the concept of scanning and uploading an image. When he asks why it can't be done, ask him how happy he'd be if a website took over his computer hardware.
  23. Give your members the ability to ignore future messages from other members, the ability to report offensive / inappropriate messages, and automatic filters that can be turned on / off.  At the very least you will want to strip Javascript from any messages submitted. You can develop a mechanism such that if X seemingly unrelated members report the same member Y for offenses, Y's ability to send messages is disabled for some duration. On your end, I'd want admin controls logging user activity.  This way you can go back and look for a history of offensive messages from a specific IP range, which would enable you to file complaints with the offender's ISP. Lastly, the only reason you should need or want to read another user's personal messages is if an offense has been reported.  Otherwise I'd say it's none of your business. (EDIT)  One last idea: when a message is submitted, you might want to check whether it matches other messages sent by your members.  It's not foolproof, but it could stop people from using multiple accounts to spam the same junk to your users. I bet this initially sounded like a simple task.  ;D
  24. Make sure that whatever you do, the final test is javascript and has all of the answers embedded within the HTML. Those are my favorite online tests.