mastubbs Posted December 30, 2008

Hi all, sorry, I'm very new to PHP so there is probably a very easy answer to this. I am using the code

<?php
if (file_exists($_GET['id'])) {
    require($_GET['id']);
} else {
    echo 'ERROR!! Please contact admin.';
}
?>

to load HTML pages into a PHP page (e.g. test.com/index.php?id=page1.html). This all works fine, but I'm trying to make it so that the address is test.com/index.php?id=page1 (i.e. no .html). I also want to make it so that these HTML files cannot be accessed except as includes on the PHP page. I hear this can be done by saving the HTML pages as .inc files, but I tried this and I can still access those pages if I type them into the browser (e.g. test.com/page1.inc). Also, I still have to type test.com/index.php?id=page1.inc to get the page; test.com/index.php?id=page1 gives me an error. Can anyone help? Thanks in advance, Matt
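For the first part of the question (dropping .html from the address), one option is to append the extension inside index.php rather than expecting it in the query string. A minimal sketch, assuming the include files really are named page1.html, page2.html and so on:

<?php
// The visitor supplies only the base name, e.g. index.php?id=page1,
// and the script adds the .html extension before including the file.
$page = $_GET['id'] . '.html';

if (file_exists($page)) {
    require($page);
} else {
    echo 'ERROR!! Please contact admin.';
}
?>

Note that this only tidies the URL; it does nothing to stop the .html files from being opened directly, which the replies below address.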
trq Posted December 30, 2008

If you save your pages as PHP files, you can then place the following code at the top of each page.

<?php
if (!defined("INCLUDED")) { die(); }
?>

This, along with the following.....

<?php
define("INCLUDED", TRUE);
if (file_exists($_GET['id'] . '.php')) {
    require($_GET['id'] . ".php");
} else {
    echo 'ERROR!! Please contact admin.';
}
?>

ought to achieve what you want. I would be inclined, however, to also create an array of valid pages and run a check against this array to make sure the requested page is actually valid. e.g.;

<?php
$valid = array('foo', 'bar', 'bob');
define("INCLUDED", TRUE);
if (file_exists($_GET['id'] . '.php') && in_array($_GET['id'], $valid)) {
    require($_GET['id'] . ".php");
} else {
    echo 'ERROR!! Please contact admin.';
}
?>
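As a usage sketch, one of the included pages (page1.php is a hypothetical file name, not something from the thread) would then start with the guard, so that requesting it directly produces nothing:

<?php
// page1.php (hypothetical file name)
// If this file is requested directly, INCLUDED is never defined and die() ends the request.
if (!defined("INCLUDED")) { die(); }
?>
<h1>Page 1</h1>
<p>This markup is only reachable via index.php?id=page1.</p>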
WhIteSidE Posted January 1, 2009

I would be inclined however to also create an array of valid pages and run a check against this array to make sure the requested page is actually valid.

This is probably a good idea. However, if you want your index script to be more powerful (i.e. able to load pages out of a directory without knowing every possible page in advance), then you should at the very least include some sanitization to make sure that somebody doesn't include a path, that is to say '../../secretfilenotinthewebroot'. ~ Christopher
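A minimal sketch of that kind of sanitization, assuming the includable pages live in a pages/ sub-directory (a hypothetical layout, not something from the thread):

<?php
define("INCLUDED", TRUE);

// basename() strips any directory components, so '../../secretfile' becomes 'secretfile'.
$id = basename($_GET['id']);

// Stricter still: only allow letters, digits, dashes and underscores in the page name.
if (!preg_match('/^[A-Za-z0-9_-]+$/', $id)) {
    die('ERROR!! Please contact admin.');
}

$page = 'pages/' . $id . '.php';

if (file_exists($page)) {
    require($page);
} else {
    echo 'ERROR!! Please contact admin.';
}
?>

basename() alone defeats the '../../' trick; the regular expression is an extra layer so that only plain page names ever reach require().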