Hi everyone,

My script inserts a huge number of rows into a MySQL table and deletes them just as easily, so to keep the index numbers under control I'm trying to make it search for a free id whenever it adds a new row, and reuse that.

It's a phpBB mod I'm writing. This is the code I'm trying:

// Probe ids from 1 upward until one is free, then insert there.
for ($i = 1; true; $i++) {
	$sql = "SELECT item_id FROM phpbb_lms_items WHERE item_id = '" . $i . "'";
	$result = $db->sql_query($sql);
	if (!$db->sql_fetchrow($result)) {
		// No row with this id - reuse it for the new item.
		$sql = "INSERT INTO phpbb_lms_items (item_id, ...) VALUES ('" . $i . "', ...)";
		if (!$db->sql_query($sql)) {
			message_die(GENERAL_ERROR, $lang['no_game_data'], "", __LINE__, __FILE__, $sql);
		}
		break;
	}
}

 

The error i'm getting is this:

Fatal error: Maximum execution time of 30 seconds exceeded in C:\AppServ\www\forum\db\mysql4.php on line 118

 

I know the search logic works, because when I tested it by just echoing $i instead of inserting the data, it printed the first available row id from the database. I also know the part that inserts the data works, because without the search for empty rows it inserts the data correctly. They're just not working together for some reason.
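For what it's worth, the per-id probing can be replaced by a single SELECT of all existing ids plus one pass in PHP. A minimal sketch of that idea; first_free_id() is a made-up helper (not part of phpBB's DBAL), and it assumes the ids come back sorted ascending:

```php
<?php
// Hypothetical helper: given the ids already in the table (fetched once,
// e.g. with "SELECT item_id FROM phpbb_lms_items ORDER BY item_id"),
// return the lowest positive id that is not taken. One query plus a
// linear scan, instead of one query per candidate id.
function first_free_id(array $ids)
{
    $expected = 1;
    foreach ($ids as $id) {
        if ($id > $expected) {
            break;          // found a hole before $id
        }
        if ($id == $expected) {
            $expected++;    // this slot is taken, try the next one
        }
    }
    return $expected;
}
```

With ids 1, 2, 4 in the table this returns 3; with an empty table it returns 1.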

 

Anyone have some advice?

agreed... indefinite loops that depend on a break; to exit are an icky idea at best...

 

besides... why not simplify that?

 

$sql = "SELECT item_id FROM phpbb_lms_items ORDER BY `id`";
while ($result = $db->sql_query($sql)) {
	$i++;
	if (!$db->sql_fetchrow($result)) {
		$sql = "INSERT INTO phpbb_lms_items (item_id, ...) VALUES ('" . $i . "', ...)";
		if (!$db->sql_query($sql)) {
			message_die(GENERAL_ERROR, $lang['no_game_data'], "", __LINE__, __FILE__, $sql);
		}
	}
}

kathas, I'm trying this because after just running the script myself twice I end up with a table with the first 600 rows empty and the data in the next 600 rows, with ids from 601-1201. With a few users running the script several times a day it would get out of hand fast.

 

taith, I don't understand your code, especially the line:

while($result = $db->sql_query($sql))

how many times will that loop run?

 

Thanks for your concern

Won't "while($result = $db->sql_query($sql))" run as long as the query "SELECT item_id FROM phpbb_lms_items ORDER BY `id`" is successful (meaning, as long as my table is not empty)?

 

AFAIK, "$db->sql_fetchrow($result)" is the one used to check whether the query found any rows. If I'm not missing anything, your code will just keep checking the first row until it's empty, and then insert the data at row $i.
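To make the difference being discussed concrete: re-running the query in the while condition never terminates on its own (a successful query is always truthy), whereas fetching rows from a single result set stops when the rows run out. A stand-alone sketch with stand-in functions (fake_query/fake_fetchrow are invented for illustration; they are not phpBB's DBAL):

```php
<?php
// Stand-ins for $db->sql_query() / $db->sql_fetchrow(), purely for
// illustration: the "result" is just an array of rows plus a cursor.
function fake_query(array $rows)
{
    return array('rows' => $rows, 'pos' => 0);
}

function fake_fetchrow(&$result)
{
    if ($result['pos'] >= count($result['rows'])) {
        return false;                      // no more rows left
    }
    return $result['rows'][$result['pos']++];
}

$rows = array(array('item_id' => 1), array('item_id' => 5));

// The usual pattern: run the query ONCE, then loop on the fetch.
$result  = fake_query($rows);
$fetched = 0;
while ($row = fake_fetchrow($result)) {
    $fetched++;
}
// $fetched is now 2 - the loop ends when the result set is exhausted.

// By contrast, "while ($result = $db->sql_query($sql))" would re-run
// the query on every iteration, and since a successful query is always
// truthy, that loop would never end on its own.
```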

 

Of course, I could be wrong, please tell me if I am.

how come you end up with 600 empty lines?

 

when you delete an entry you delete the whole row, you don't just empty its fields...

 

so that means you will have ids 1, 601, 602, ... and not 1, 2 (empty), ..., 601 (not empty).

 

Then, if your class returns the whole result as an array of rows, you could do something like

 

<?php
$sql = "SELECT item_id FROM phpbb_lms_items ORDER BY `id`";
$result = $db->sql_query($sql);
foreach ($result as $line)
{
	// check if empty or whatever
}

 

the above code should have the same result as taith's, if your class returns the query's result as one array holding all rows...

 

Please explain your empty-row problem, because it doesn't make sense to me... Do the rows not exist, or do they exist with empty fields?

Sorry for the confusion, what I meant was that the first 600 rows don't exist; they aren't there with empty fields, they simply aren't there.

 

For example, if the database consists of the rows with the next IDs:

1

5

6

7

13

(the others don't exist)

 

I want to insert the new data into ids 2, 3, 4, 8, 9, 10, 11, 12, 14, ...
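That allocation order can be sketched as a throwaway helper (free_ids() is made up, not part of the mod): given the existing ids, it yields free slots in exactly the order listed above, gaps first, then ids past the current maximum.

```php
<?php
// Hypothetical helper: given the ids currently in the table, return the
// first $count free ids in ascending order - holes in the sequence
// first, then ids beyond the highest existing one.
function free_ids(array $existing, $count)
{
    $taken = array_flip($existing);   // id => index, for O(1) lookups
    $free  = array();
    for ($id = 1; count($free) < $count; $id++) {
        if (!isset($taken[$id])) {
            $free[] = $id;
        }
    }
    return $free;
}

// With ids 1, 5, 6, 7, 13 present, the first nine free ids are
// 2, 3, 4, 8, 9, 10, 11, 12, 14 - matching the example above.
```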

It works like this:

- The user has a text file which lists items and the name of the backpack they're in. This file could be huge (more than a thousand items listed); it comes from the inventory system of a game (a separate program, not a browser game).

- The class records the list of backpacks in a table and the items in another table, the items table has a field for the ID of the backpack which it belongs to.

- I want to make it easy for the users to upload their text file again (which could differ from last time) to update their online inventory. So each time it reads a backpack from an uploaded text file and a backpack with the same name already exists in the db, it deletes every item from that backpack in the db and records the new items from the text file. This is why the index numbers for the items table end up rising so fast.
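The delete-then-reinsert step above, combined with id reuse, can be modelled in memory to check the behaviour. A sketch with an array standing in for the items table (replace_backpack_items() and the column names are invented for illustration, not the mod's real schema):

```php
<?php
// In-memory stand-in for the items table, keyed by item_id. Replaces
// every item belonging to $backpack_id with the freshly uploaded ones,
// reusing the ids that the deletion just freed - the behaviour the
// re-upload step above describes.
function replace_backpack_items(array $items, $backpack_id, array $new_names)
{
    // Delete: drop the old items of this backpack (their ids free up).
    foreach ($items as $id => $row) {
        if ($row['backpack_id'] == $backpack_id) {
            unset($items[$id]);
        }
    }
    // Insert: fill gaps first, then append past the current maximum id.
    $next = 1;
    foreach ($new_names as $name) {
        while (isset($items[$next])) {
            $next++;
        }
        $items[$next] = array('backpack_id' => $backpack_id, 'name' => $name);
    }
    ksort($items);
    return $items;
}
```

Repeated uploads then keep the id range compact instead of letting it grow by the full item count each time.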
