Alright, so every time I lose my data and have to rebuild a website, this is the first thing I do, and every time I end up searching for hours just to get the .htaccess file right, because apparently nobody has ever wanted to do exactly this without already knowing enough not to have to ask. So I thought I'd write up a tutorial in case any beginners want to do the same thing but don't know how. My question is this: is this more efficient and more useful than other possible ways of doing the same thing?

 

.htaccess

RewriteEngine On
# Don't rewrite index.php itself, otherwise the rule below would loop
RewriteCond %{REQUEST_URI} !index\.php
# Send every *.php request to index.php?page=<name>, e.g. /about.php becomes
# /index.php?page=about; QSA keeps any existing query string, L stops rewriting
RewriteRule ^(.*)\.php$ /index.php?page=$1 [QSA,L]

 

Obviously it's pretty simple. After this I organize my pages into a database table and pull the data as necessary. To me this makes it really easy to create a theming engine and reduce the size of actual files in your website to 2. Configuration files and such can also be stored in the database, so all you need is your .htaccess and index.php. This also makes it easier to back up your website, by creating a cron job which does a mysqldump of your database and sends it to a secure location.
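In case it helps anyone, here is a rough sketch of what the page table could look like. The column names are the ones my code further down actually references; the types and keys are just a reasonable guess, so adjust them to whatever suits you:

<?php
// Rough sketch of the page table; column names come from the code further down
// in this thread, the types and keys are just a guess.
include_once('config.php');
mysql_connect($db_host, $db_user, $db_pass) or die(mysql_error());
mysql_select_db($db_name) or die(mysql_error());
mysql_query("CREATE TABLE IF NOT EXISTS {$db_prefix}page (
	page_id INT UNSIGNED NOT NULL AUTO_INCREMENT,
	page_title VARCHAR(255) NOT NULL,
	page_location VARCHAR(255) NOT NULL,
	page_aliases TEXT,
	page_attr TEXT,
	page_body LONGTEXT,
	page_ctime INT UNSIGNED NOT NULL,
	page_mtime INT UNSIGNED NOT NULL,
	PRIMARY KEY (page_id),
	UNIQUE KEY page_location (page_location)
)") or die(mysql_error());
print 'page table created';
?>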

 

Again, my question is this: is this more efficient and more useful than other possible ways of doing the same thing?

...to me this makes it really easy to create a theming engine and reduce the size of actual files in your website to ...

The size of the actual file, or the code doesn't matter. The user isn't downloading that code from your website, only the output. You can have a small file containing the following:

 

<?php
// echo exactly one megabyte (1,048,576 bytes) of '.' characters
for($i = 0; $i < 1048576; ++$i) {
    echo '.';
}

 

Sure, the size of the file on your server is small, but they're still downloading a megabyte of data from your server.

With many visitors on your pages, your database will be under heavy stress. Some caching system would be useful.
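Even a very simple whole-page cache would take most of that load off MySQL. Something along these lines, where the cache directory and the five-minute lifetime are just placeholders:

<?php
// Minimal whole-page cache: serve a stored copy of the output if it is fresh
// enough, otherwise build the page as usual and save it for next time.
// The cache directory and the 300 second lifetime are just placeholders.
$cache_file = '/tmp/page_cache_' . md5($_SERVER['REQUEST_URI']);
$cache_life = 300;

if(file_exists($cache_file) && (time() - filemtime($cache_file)) < $cache_life) {
	readfile($cache_file); // cache hit, no database work at all
	exit;
}

ob_start();
// ... the normal index.php logic goes here (database query, templating, tidy) ...
$output = ob_get_clean();
file_put_contents($cache_file, $output);
print $output;
?>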

 

What I'm most wondering is whether the stress created on the database would be handled better than the stress that would be put on the file system if I did it the other way.

 

The size of the actual file, or the code doesn't matter. The user isn't downloading that code from your website, only the output. You can have a small file containing the following:

...

Sure, the size of the file on your server is small, but they're still downloading a megabyte of data from your server.

 

I didn't mean that it reduces the size of individual files, but rather that it reduces the number of files that need to be referenced and requested.

 

Another thing I've realized, which isn't necessarily a huge advantage but does make things easier and more dynamic than a huge .htaccess file, is that you can alias pages in the database. For example, you could have

 

main.php = welcome.php = home.php = index.php

 

While I haven't yet realized the full usefulness of this feature, I'm sure someone will.
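One thing to watch out for: the lookup I use further down matches aliases with LIKE '%...%', so 'home' would also match a page aliased 'homepage'. If that ever becomes a problem, an exact match with MySQL's FIND_IN_SET() against a comma-separated alias list would be safer. A rough sketch, with a made-up helper name:

<?php
// Exact alias match, assuming page_aliases holds a comma separated list such as
// "home,main,welcome". FIND_IN_SET avoids the partial matches a LIKE '%...%'
// lookup would allow. The helper name is hypothetical.
function get_page_by_alias($alias) {
	global $db_prefix;
	$alias = mysql_real_escape_string($alias);
	$query = mysql_query("SELECT * FROM {$db_prefix}page WHERE FIND_IN_SET('{$alias}', page_aliases)");
	return (mysql_num_rows($query) == 1) ? mysql_fetch_assoc($query) : false;
}
?>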

I just did a test with microtime, comparing including a file against getting the same content from the database, and this is what I found:

 

file system: 0.00078500000000004

database: 0.00054600000000005

 

Now, I know this isn't as accurate as I'd like, because it may not be the same as if 5,000 people were looking for something at once. I wonder how I could find that information.
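For reference, the comparison was roughly along these lines; 'hello.php' and the page id 1 are just whatever you happen to be testing with, and get_page_data() is the function I post further down:

<?php
// Rough version of the comparison: time a plain include against pulling the
// same content out of MySQL. 'hello.php' and the page id 1 are placeholders.
include_once('config.php');
include_once('functions.php');
mysql_connect($db_host, $db_user, $db_pass) or die(mysql_error());
mysql_select_db($db_name) or die(mysql_error());

$start = microtime(true);
ob_start();
include('hello.php');
ob_end_clean();
$file_time = microtime(true) - $start;

$start = microtime(true);
$page_data = get_page_data(1);
$db_time = microtime(true) - $start;

printf("file system: %.14f\ndatabase: %.14f\n", $file_time, $db_time);
?>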

So if I did it right, then from what I gather from the results below, it seems as if MySQL performs far better under the stress than the file system itself does.

 

Using MySQL

fly@vacuum:~/Downloads$ ab -n 500 mysite.tld/index.php
This is ApacheBench, Version 2.3 <$Revision: 655654 $>
Copyright 1996 Adam Twiss, Zeus Technology Ltd, http://www.zeustech.net/
Licensed to The Apache Software Foundation, http://www.apache.org/

Benchmarking mysite.tld (be patient)
Completed 100 requests
Completed 200 requests
Completed 300 requests
Completed 400 requests
Completed 500 requests
Finished 500 requests


Server Software:        Apache/2.2.14
Server Hostname:        mysite.tld
Server Port:            80

Document Path:          /index.php
Document Length:        448 bytes

Concurrency Level:      1
Time taken for tests:   2.780 seconds
Complete requests:      500
Failed requests:        0
Write errors:           0
Total transferred:      330000 bytes
HTML transferred:       224000 bytes
Requests per second:    179.83 [#/sec] (mean)
Time per request:       5.561 [ms] (mean)
Time per request:       5.561 [ms] (mean, across all concurrent requests)
Transfer rate:          115.91 [Kbytes/sec] received

Connection Times (ms)
              min  mean[+/-sd] median   max
Connect:        0    0   0.0      0       0
Processing:     5    5   0.7      5      11
Waiting:        0    5   0.7      5       9
Total:          5    5   0.7      5      11

Percentage of the requests served within a certain time (ms)
  50%      5
  66%      6
  75%      6
  80%      6
  90%      6
  95%      7
  98%      7
  99%      8
100%     11 (longest request)

 

Using file_get_contents()

fly@vacuum:~/Downloads$ ab -n 500 mysite.tld/index.php
This is ApacheBench, Version 2.3 <$Revision: 655654 $>
Copyright 1996 Adam Twiss, Zeus Technology Ltd, http://www.zeustech.net/
Licensed to The Apache Software Foundation, http://www.apache.org/

Benchmarking mysite.tld (be patient)
Completed 100 requests
Completed 200 requests
Completed 300 requests
Completed 400 requests
Completed 500 requests
Finished 500 requests


Server Software:        Apache/2.2.14
Server Hostname:        mysite.tld
Server Port:            80

Document Path:          /index.php
Document Length:        446 bytes

Concurrency Level:      1
Time taken for tests:   2.741 seconds
Complete requests:      500
Failed requests:        0
Write errors:           0
Total transferred:      329000 bytes
HTML transferred:       223000 bytes
Requests per second:    182.40 [#/sec] (mean)
Time per request:       5.482 [ms] (mean)
Time per request:       5.482 [ms] (mean, across all concurrent requests)
Transfer rate:          117.21 [Kbytes/sec] received

Connection Times (ms)
              min  mean[+/-sd] median   max
Connect:        0    0   0.0      0       1
Processing:     4    5   0.9      5      12
Waiting:        4    5   0.9      5      12
Total:          4    5   0.9      5      12

Percentage of the requests served within a certain time (ms)
  50%      5
  66%      6
  75%      6
  80%      6
  90%      6
  95%      7
  98%      8
  99%      8
100%     12 (longest request)

Again, with 50,000 requests instead of 500, I see only a slight difference, but it still tilts towards MySQL, depending on how you interpret the data of course.

 

Using MySQL

fly@vacuum:~/Downloads$ ab -n 50000 mysite.tld/index.php
This is ApacheBench, Version 2.3 <$Revision: 655654 $>
Copyright 1996 Adam Twiss, Zeus Technology Ltd, http://www.zeustech.net/
Licensed to The Apache Software Foundation, http://www.apache.org/

Benchmarking mysite.tld (be patient)
Completed 5000 requests
Completed 10000 requests
Completed 15000 requests
Completed 20000 requests
Completed 25000 requests
Completed 30000 requests
Completed 35000 requests
Completed 40000 requests
Completed 45000 requests
Completed 50000 requests
Finished 50000 requests


Server Software:        Apache/2.2.14
Server Hostname:        mysite.tld
Server Port:            80

Document Path:          /index.php
Document Length:        448 bytes

Concurrency Level:      1
Time taken for tests:   282.502 seconds
Complete requests:      50000
Failed requests:        0
Write errors:           0
Total transferred:      33000000 bytes
HTML transferred:       22400000 bytes
Requests per second:    176.99 [#/sec] (mean)
Time per request:       5.650 [ms] (mean)
Time per request:       5.650 [ms] (mean, across all concurrent requests)
Transfer rate:          114.08 [Kbytes/sec] received

Connection Times (ms)
              min  mean[+/-sd] median   max
Connect:        0    0   0.0      0       1
Processing:     5    5   0.8      5      23
Waiting:        0    5   0.8      5      21
Total:          5    6   0.8      5      23
WARNING: The median and mean for the total time are not within a normal deviation
        These results are probably not that reliable.

Percentage of the requests served within a certain time (ms)
  50%      5
  66%      6
  75%      6
  80%      6
  90%      7
  95%      7
  98%      8
  99%      8
100%     23 (longest request)

 

Using file_get_contents()

fly@vacuum:~/Downloads$ ab -n 50000 mysite.tld/index2.php
This is ApacheBench, Version 2.3 <$Revision: 655654 $>
Copyright 1996 Adam Twiss, Zeus Technology Ltd, http://www.zeustech.net/
Licensed to The Apache Software Foundation, http://www.apache.org/

Benchmarking mysite.tld (be patient)
Completed 5000 requests
Completed 10000 requests
Completed 15000 requests
Completed 20000 requests
Completed 25000 requests
Completed 30000 requests
Completed 35000 requests
Completed 40000 requests
Completed 45000 requests
Completed 50000 requests
Finished 50000 requests


Server Software:        Apache/2.2.14
Server Hostname:        mysite.tld
Server Port:            80

Document Path:          /index2.php
Document Length:        199 bytes

Concurrency Level:      1
Time taken for tests:   275.940 seconds
Complete requests:      50000
Failed requests:        0
Write errors:           0
Total transferred:      20550000 bytes
HTML transferred:       9950000 bytes
Requests per second:    181.20 [#/sec] (mean)
Time per request:       5.519 [ms] (mean)
Time per request:       5.519 [ms] (mean, across all concurrent requests)
Transfer rate:          72.73 [Kbytes/sec] received

Connection Times (ms)
              min  mean[+/-sd] median   max
Connect:        0    0   0.0      0       2
Processing:     5    5   0.7      5      24
Waiting:        0    5   0.7      5      24
Total:          5    5   0.7      5      24

Percentage of the requests served within a certain time (ms)
  50%      5
  66%      5
  75%      6
  80%      6
  90%      6
  95%      7
  98%      7
  99%      8
100%     24 (longest request)

Using include() doesn't really change anything, except that it drastically increases the max time.

 

Using include()

fly@vacuum:~$ ab -n 50000 mysite.tld/index2.php
This is ApacheBench, Version 2.3 <$Revision: 655654 $>
Copyright 1996 Adam Twiss, Zeus Technology Ltd, http://www.zeustech.net/
Licensed to The Apache Software Foundation, http://www.apache.org/

Benchmarking mysite.tld (be patient)
Completed 5000 requests
Completed 10000 requests
Completed 15000 requests
Completed 20000 requests
Completed 25000 requests
Completed 30000 requests
Completed 35000 requests
Completed 40000 requests
Completed 45000 requests
Completed 50000 requests
Finished 50000 requests


Server Software:        Apache/2.2.14
Server Hostname:        mysite.tld
Server Port:            80

Document Path:          /index2.php
Document Length:        199 bytes

Concurrency Level:      1
Time taken for tests:   256.327 seconds
Complete requests:      50000
Failed requests:        0
Write errors:           0
Total transferred:      20550000 bytes
HTML transferred:       9950000 bytes
Requests per second:    195.06 [#/sec] (mean)
Time per request:       5.127 [ms] (mean)
Time per request:       5.127 [ms] (mean, across all concurrent requests)
Transfer rate:          78.29 [Kbytes/sec] received

Connection Times (ms)
              min  mean[+/-sd] median   max
Connect:        0    0   0.0      0       1
Processing:     5    5   0.6      5     120
Waiting:        0    5   0.8      5     120
Total:          5    5   0.6      5     121

Percentage of the requests served within a certain time (ms)
  50%      5
  66%      5
  75%      5
  80%      5
  90%      5
  95%      6
  98%      6
  99%      6
100%    121 (longest request)

functions.php

<?php
// Look a page up either by its numeric id or by its location/alias, and return
// the row as an array (with slashes stripped), or false if nothing matches.
function get_page_data($page) {
	global $db_prefix;
	if(is_numeric($page)) {
		// Numeric argument: look the page up by id
		$page_id = (int) $page;
		$page_query = mysql_query("SELECT * FROM ${db_prefix}page WHERE page_id='${page_id}'");
		if(mysql_num_rows($page_query) == 1) {
			foreach(mysql_fetch_assoc($page_query) as $key => $value) {
				$page_data[$key] = stripslashes($value);
			}
			return $page_data;
		} else {
			return false;
		}
	} else {
		// String argument: try an exact match on page_location first
		$page_location = mysql_real_escape_string($page);
		$page_query = mysql_query("SELECT * FROM ${db_prefix}page WHERE page_location='${page_location}'");
		if(mysql_num_rows($page_query) == 1) {
			foreach(mysql_fetch_assoc($page_query) as $key => $value) {
				$page_data[$key] = stripslashes($value);
			}
			return $page_data;
		} else {
			// No exact match: fall back to the alias list
			$page_query = mysql_query("SELECT * FROM ${db_prefix}page WHERE page_aliases LIKE '%${page_location}%'");
			if(mysql_num_rows($page_query) == 1) {
				foreach(mysql_fetch_assoc($page_query) as $key => $value) {
					$page_data[$key] = stripslashes($value);
				}
				return $page_data;
			} else {
				return false;
			}
		}
	}
}
?>

 

index.php

<?php ob_start(); ?>
<!DOCTYPE html>
<html>
<?php
	include_once('config.php');
	include_once('functions.php');
	mysql_connect($db_host,$db_user,$db_pass) or die(mysql_error());
	mysql_select_db($db_name) or die(mysql_error());
	// Page name comes from the rewrite rule (?page=...); default to the front page
	$page = isset($_GET['page']) ? $_GET['page'] : 'index';
	$page_data = get_page_data($page);
	if($page_data != false) {
	?>
		<head>
			<meta http-equiv="Content-Type" content="text/html; charset=UTF-8">
			<meta name="description" content="<?php print $site_description; ?>">
			<title><?php print $site_title_pre . $page_data['page_title']; ?></title>
		</head>
		<body>
			<p>
				<?php print $page_data['page_body']; ?>
			</p>
		</body>
	<?php
	} else {
		header('HTTP/1.0 404 Not Found');
	?>
		<head><title>Page Not Found</title></head>
		<body>Error #404: The requested page does not exist.</body>
	<?php
	}
	/*if(mysql_query("INSERT INTO `tcs_page` (`page_title`,`page_location`,`page_body`,`page_attr`,`page_ctime`,`page_mtime`) VALUES ('Home','index','Hello World!','aliases=\"home,main,welcome\"','" . time() . "','" . time() . "')")) {
		print 'OK';
	} else {
		print mysql_error();
	}*/
?>
</html>
<?php
// Tidy up the buffered output before sending it
$tidy_data = ob_get_clean();
$tidy_config = array(
	'indent' => true,
	'wrap'   => 0); // 0 disables line wrapping
$tidy = new tidy;
$tidy->parseString($tidy_data, $tidy_config, 'utf8');
$tidy->cleanRepair();
print $tidy;
?>

 

index2.php

<?php ob_start(); ?>
<!DOCTYPE html>
<html>
<?php
	include_once('config.php');
	include_once('functions.php');
	mysql_connect($db_host,$db_user,$db_pass) or die(mysql_error());
	mysql_select_db($db_name) or die(mysql_error());
	?>
		<head>
			<meta http-equiv="Content-Type" content="text/html; charset=UTF-8">
			<meta name="description" content="<?php print $site_description; ?>">
			<title><?php print $site_title_pre . $page_data['page_title']; ?></title>
		</head>
		<body>
			<p>
				<?php include('hello.php');  ?>
				<?php print $page_data['page_body']; ?>
			</p>
		</body>
</html>
<?php
// Tidy up the buffered output before sending it
$tidy_data = ob_get_clean();
$tidy_config = array(
	'indent' => true,
	'wrap'   => 0); // 0 disables line wrapping
$tidy = new tidy;
$tidy->parseString($tidy_data, $tidy_config, 'utf8');
$tidy->cleanRepair();
print $tidy;
?>
