
Need help creating this query..


sean04


So for example I have a table as such:

 

PackageID   OfficeID   NumUsers
1           2333       55
2           2333       55

 

How would I go about writing a query that copies that exact information within the Packages table, but to a different OfficeID? So after I run the query, the Packages table would look like this:

 

PackageID   OfficeID   NumUsers   Name
1           2333       55         Example
2           2333       55         Example
1           4100       55         Example
2           4100       55         Example

 

This was my idea:

 


$q1 = mysql_query("SELECT * FROM tbl_packages WHERE OfficeID = '" . mysql_real_escape_string($_GET['OfficeID']) . "'");
while ($r1 = mysql_fetch_array($q1)) {
    // Quote the array keys, escape user input, and note: no trailing
    // comma after the last value in the VALUES list.
    mysql_query("
        INSERT INTO tbl_packages (PackageID, OfficeID, NumUsers, Name)
        VALUES (
            '" . $r1['PackageID'] . "',
            '" . mysql_real_escape_string($_GET['OfficeID']) . "',
            '" . $r1['NumUsers'] . "',
            '" . mysql_real_escape_string($r1['Name']) . "'
        )
    ");
}

 

Only problem is that this way would take forever if there were 100 fields in the Packages table... maybe a subquery?

 

Thanks,

Sean


The basic query would be something like this:

INSERT INTO tbl_packages (PackageID, OfficeID, NumUsers) 
SELECT PackageID, 4100, NumUsers
FROM tbl_packages 
WHERE OfficeID = 2333

 

That will do the insert all at once. You can add any other columns as you see fit.
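A quick way to convince yourself this works is a minimal sketch using Python's built-in sqlite3 as a stand-in for MySQL (the INSERT ... SELECT has the same shape in both; table and column names taken from the thread):

```python
import sqlite3

# In-memory database standing in for MySQL.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE tbl_packages (PackageID INT, OfficeID INT, NumUsers INT)")
conn.executemany(
    "INSERT INTO tbl_packages VALUES (?, ?, ?)",
    [(1, 2333, 55), (2, 2333, 55)],
)

# Copy office 2333's rows to office 4100 in one statement.
# The 4100 here is a numeric literal, not a column name.
conn.execute(
    "INSERT INTO tbl_packages (PackageID, OfficeID, NumUsers) "
    "SELECT PackageID, 4100, NumUsers FROM tbl_packages WHERE OfficeID = 2333"
)

rows = conn.execute(
    "SELECT PackageID, OfficeID, NumUsers FROM tbl_packages "
    "ORDER BY OfficeID, PackageID"
).fetchall()
print(rows)
# [(1, 2333, 55), (2, 2333, 55), (1, 4100, 55), (2, 4100, 55)]
```

One round trip to the database creates all the copied rows, regardless of how many there are.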


DavidM: You cannot SELECT 4100. That would be an invalid column name. You cannot do it with INSERT INTO ... SELECT, which doesn't let you specify any new values.

 

Edit: You can specify new values for columns if the record fails duplicate key constraint checks.

 

sean04: You have the right idea with a loop.


WRT the original loop: my thought is that if you looped through the rows of a table and created a new row for every row you found, you would blow the RAM on the server, as the table would never run out of rows to read:

 

1. Find a row

2. Write a new row

3. Find the row you have just written

4. Write a new row

5. Find the row you have just written

6. *kablam*

 

You might want to use CREATE TEMPORARY TABLE here?

 

If you created a temporary table before running the script, then ran the script and, for every line you find, copied that line to the temporary table plus another line with the new OfficeID, you would reach the end of the script once the first table had run out of lines.

 

You could then (possibly; there may be a better answer here) run a TRUNCATE on the original table, then output the values of the temporary table back into the original table with a query similar to the first one.

 

You'd need to do it all in one script, as temporary tables are dropped once the PHP script finishes and the connection closes. Or else actually create a 'real' holding table for the temp data.

 

Having said all that, there is probably a better, cleaner way to do it... :)

 

Wool


sean04: Make sure your initial dataset query uses something like WHERE OfficeID = $_GET['original_OfficeID'], and that your while loop uses $_GET['new_OfficeID']'s value for the INSERT. The way you have it makes an exact duplicate of the records with the specific $_GET['OfficeID'].

 

woolyg: You wouldn't want to do that. The idea is to create a dataset from one query, then loop through that dataset, INSERTing its records into the same table the dataset was created from, making any changes to the columns as needed. You would never hit any new records you've just inserted into the table, because the initial dataset isn't updated.
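This point can be checked directly. A sketch with sqlite3 standing in for MySQL: the result set is fetched before the loop starts (as `mysql_query`'s buffered results are in the PHP version), so inserts made inside the loop are never re-read by it:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE tbl_packages (PackageID INT, OfficeID INT, NumUsers INT)")
conn.executemany("INSERT INTO tbl_packages VALUES (?, ?, ?)",
                 [(1, 2333, 55), (2, 2333, 55)])

# Snapshot the dataset first, as the PHP loop does.
dataset = conn.execute(
    "SELECT PackageID, NumUsers FROM tbl_packages WHERE OfficeID = 2333"
).fetchall()

# Insert while iterating over the snapshot; the loop body runs exactly
# once per original row, never over the rows it just wrote.
for package_id, num_users in dataset:
    conn.execute("INSERT INTO tbl_packages VALUES (?, ?, ?)",
                 (package_id, 4100, num_users))

count = conn.execute("SELECT COUNT(*) FROM tbl_packages").fetchone()[0]
print(count)  # 4 -- no runaway growth
```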

 

 


@objNoob wrote: "You cannot SELECT 4100. That would be an invalid column name. You cannot do it with INSERT INTO ... SELECT, which doesn't let you specify any new values. [...] sean04: You have the right idea with a loop."

 

@objNoob The 4100 is a numeric LITERAL.  You most certainly CAN use literal values in a SELECT statement as well as an INSERT ... SELECT ... statement. You can also use string literals as long as you quote them.  I've been doing it for 20 years.

 

@sean04 The most efficient way - especially if the new values are constant for all rows - is to use the INSERT ... SELECT ... statement. One trip to the database to create ALL the rows.


What I'm wondering is: is there a way to update one field while all the other fields stay the same as the row you are copying from? From my first example, there could be 100 fields that need to be inserted to make a new record, but the only field that needs actual changing is OfficeID. I need a query that wouldn't require me to enter all 100 fields just to change a single one.

 

Something like this

 

INSERT INTO tbl_packages

SELECT OfficeID, * WHERE OfficeID = 4100

 

I know the above is hard to understand because it doesn't quite make sense, but I hope one of you will understand what I'm trying to do.
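One way to sketch the effect being asked for here, building on the temporary-table idea mentioned earlier in the thread: copy the rows with SELECT * (no column list needed), change the one column in the temp table, then insert everything back. A minimal sketch with sqlite3 standing in for MySQL (MySQL's CREATE TEMPORARY TABLE ... SELECT works the same way; table and column names assumed from the thread):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE tbl_packages "
             "(PackageID INT, OfficeID INT, NumUsers INT, Name TEXT)")
conn.executemany("INSERT INTO tbl_packages VALUES (?, ?, ?, ?)",
                 [(1, 2333, 55, "Example"), (2, 2333, 55, "Example")])

# Copy the source rows into a temp table -- no column list needed,
# even if the table had 100 columns.
conn.execute("CREATE TEMPORARY TABLE tmp AS "
             "SELECT * FROM tbl_packages WHERE OfficeID = 2333")
# Change only the one column that differs.
conn.execute("UPDATE tmp SET OfficeID = 4100")
# Insert everything back; SELECT * matches the original column order.
conn.execute("INSERT INTO tbl_packages SELECT * FROM tmp")
conn.execute("DROP TABLE tmp")

rows = conn.execute("SELECT * FROM tbl_packages "
                    "ORDER BY OfficeID, PackageID").fetchall()
print(rows)
```

Three short statements, and no field list to maintain when the table grows.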

 

Thanks,

Sean

