
Posts posted by matfish

  1. Hi,

     

    I have www.website1.com running on a dedicated server. I would like www.website2.com to also point to www.website1.com, but not just as a web redirect. I have played around with the VirtualHosts on the machine, but when I go to www.website2.com it asks me to download a file instead of serving the page.

     

    Any ideas?

     

    Many thanks
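    For reference, the sort of VirtualHost block I have been playing with looks roughly like this (a minimal sketch; the domain names and paths here are placeholders, not my real setup):

```apache
<VirtualHost *:80>
    ServerName www.website1.com
    # Serve website2.com from this same vhost instead of a separate one,
    # so both hostnames hit the same DocumentRoot.
    ServerAlias www.website2.com website1.com website2.com
    DocumentRoot /var/www/vhosts/website1.com/httpdocs
</VirtualHost>
```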

    Thanks for the reply. I have already put a robots.txt file in place, and I know crawlers are supposed to abide by the rules set there; however, in the past I have still found blog-related pages in Google/Yahoo listings before the site's official launch.

     

    Which is why I wanted something a bit more secure.

     

    Many thanks

  3. Sorry - didn't know which section to post this in...

     

    Is there a way to allow the root index.php file to be displayed to all, while making all other directories password-protected, without putting a .htaccess file in every directory?

     

    The idea being that I would like a holding page on the root of the domain, but to start development beyond that behind password protection.

     

    Many thanks
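    Something like this in a single root-level .htaccess is what I had in mind (Apache 2.2 syntax; the AuthUserFile path is made up). Protection set in the root .htaccess is inherited by every subdirectory, and the Files block then exempts the holding page:

```apache
# Password-protect the whole site from the root downwards...
AuthType Basic
AuthName "Development area"
AuthUserFile /var/www/vhosts/website1.com/.htpasswd
Require valid-user

# ...except the holding page itself
<Files "index.php">
    Order allow,deny
    Allow from all
    Satisfy Any
</Files>
```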

  4. Hi, I'm running my CSS through the CSS validation website (of course), and I get the following "warnings" (not "errors" as such):

     

    94  	 .topHeader .breadcrumb .where_are_you ul li  	You have no background-color set (or background-color is set to transparent) but you have set a color. Make sure that cascading of colors keeps the text reasonably legible.
    102 	.topHeader .breadcrumb .where_are_you ul li a 	You have no background-color set (or background-color is set to transparent) but you have set a color. Make sure that cascading of colors keeps the text reasonably legible.

     

    I understand it wants a background color in case a user has pre-defined browser settings which will override my own. However, I have a gradient background, so I do not want to define a background color as it would ruin the effect.

     

    Is there a way to overcome this, or shall I just ignore it, as it's only a "warning" and not an "error"?

     

    Many thanks
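    One thing I'm considering trying: declaring a fallback background-color in the same shorthand as the gradient image, since the color is only visible where the image doesn't render. A sketch using the selector from my warnings above (the hex values and image path are made up):

```css
.topHeader .breadcrumb .where_are_you ul li {
    /* fallback color shows only if the gradient image fails to load,
       which should satisfy the validator without spoiling the effect */
    background: #336699 url("../images/gradient.png") repeat-x;
    color: #ffffff;
}
```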

  5. Hi there,

     

    I have some huge error and access logs. I use Plesk, which has a delete option for the access logs after I download them. I assume they will be re-created once deleted?

     

    The error logs do not have a delete option. If I locate them on the server and delete them, will they be re-created as well?

     

    Many thanks
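    One thing I've read, so treat it as an assumption: Apache keeps its log files open, so deleting one means nothing gets written until a restart, whereas truncating the file in place keeps the open handle valid. A sketch:

```shell
# Truncate a log in place rather than deleting it, so Apache's open
# file handle stays valid and logging continues immediately.
truncate_log() {
    : > "$1"
}

# e.g. (this path is only a guess at the Plesk layout):
# truncate_log /var/www/vhosts/website1.com/statistics/logs/error_log
```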

  6. Hi there,

     

    I have some documents uploaded to a directory. I'm trying a file manager script, but I'm unable to delete files: the files I upload via FTP first need to be chmod'ed to 666 before it works. So I wrote a PHP script to change all the files' permissions, but it doesn't work.

     

    If I look on the dedibox, the ownership is ftp_user:plesk, but if I change it to ftp_user:apache it doesn't work. If I change it to apache:apache the online form works, but then I no longer have access via FTP.

     

    Any ideas what the correct ownership should be for both FTP and website use?

     

    Many thanks
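    For what it's worth, I suspect my PHP chmod script failed because PHP runs as the apache user, and chmod only works on files you own. The alternative I'm going to try is fixing it from the shell instead (the "apache" group name is an assumption about my setup):

```shell
# Make uploaded files writable by both the FTP user and Apache:
# files become 664, directories 775, relying on a shared group.
make_group_writable() {
    find "$1" -type f -exec chmod 664 {} \;
    find "$1" -type d -exec chmod 775 {} \;
}

# Then, as root (assuming Apache runs under the group "apache"):
#   chown -R ftp_user:apache /var/www/vhosts/website1.com/httpdocs/uploads
```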

  7. I've tried chown admin:admin / admin:root / root:root, and still it's hitting the .sh script but not actually running it. As a test the .sh script just does "mkdir testdirectory", and... nothing!

     

    It doesn't have to run as root or admin, as long as it runs! When I logged in over SSH as root it was fine. When I logged in over SSH as admin there was a permissions error, but I fixed it and it now runs from SSH as admin; still nothing when it's run from cron, though.

     

    How do I find out which user cron is running the job as?

     

    Any other ideas? It's doing my head in!

     

    Thanks for your help
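    To answer my own question about which user cron runs as, I'm going to try a temporary crontab entry that just logs the job's identity (to be removed once checked):

```
# Every minute, record which user and environment the cron job
# actually runs with:
* * * * * id > /tmp/cron_id.txt 2>&1
```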

  8. Hi there,

     

    I thought I understood the above, but when I tried it, it still did not work.

     

    I logged in via Plesk as "admin" chose the domain "123.com" hit "Crontab" and set up the crontab under the system user "123_user".

     

    However the .sh script was created as "root" directly via SSH.

     

    Any ideas, given the above, what the ownership should be? I can still see that the script is being hit by cron, but nothing happens.

     

    Many thanks for your help

  9. Hi there, forget the above issue; it turned out to be a problem in the PHP script.

     

    I do have another issue though, and you kind of answered it above, but I still need a bit of guidance.

     

    I use Plesk/RedHat 5 and set up a crontab to run a .sh script on the server. I can see from "ls -lut *" in the script's directory that the crontab is hitting the script at the correct time, but the .sh script doesn't do anything. If I run the script as root over SSH it works fine.

     

    You mentioned that cron is probably not running the job as root. Do I have to "chown user:user *" the files to someone else? I'm not sure which user it would be running as.

     

    Many thanks
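    One thing I've since read, so an assumption on my part: cron runs jobs with a very minimal environment rather than a login shell, so commands that work in my root SSH session can fail silently under cron. I'm going to change the crontab entry to use absolute paths and log all output (the script path here is made up):

```
# Run the script at 2:30am via an explicit shell and capture all
# output, so any permission or PATH errors end up somewhere visible:
30 2 * * * /bin/sh /root/scripts/backup.sh >> /tmp/backup_cron.log 2>&1
```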

     

  10. Hi there,

     

    I have a .php script which outputs some stats and emails them to me, a .sh script that curls the PHP script on the webspace, and a cron job which invokes the .sh script.

     

    If I log in over SSH as root and run the .sh script, or run the PHP script manually, it all works fine. But if I let cron invoke the script, I get the email but there are no stats attached.

     

    Is this an issue with cron or with PHP?

     

    Many thanks
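    Since the .sh wrapper is basically one curl call, I'm going to log its full output under cron instead of discarding it, to see what differs from the manual run (the URL and paths here are placeholders):

```
# Half-hourly: call the stats page with an absolute path to curl and
# keep everything it prints, including errors, for comparison:
*/30 * * * * /usr/bin/curl -s http://www.website1.com/stats.php >> /tmp/stats_curl.log 2>&1
```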

  11. Hi there, thanks for all the replies. As I said, I'm a bit of a newbie; I only know enough about RedHat to get by.

     

    The idea is to back up the database, and then the web content, into a directory, with each file renamed to the date, for example 2009_03_25.sql and 2009_03_25.tar, on a nightly or weekly basis.

     

    I was then looking for a script to delete the older files, aiming to keep the last 14 days' worth each time the "clean up" script runs.

     

    I will give your code a go and see what happens; the last time, I managed to .tar up every individual file on the server!

     

    Any ideas on the .sh code to run which will delete files more than 10 days old?
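    The kind of clean-up I'm going to try is based on find's -mtime test (the directory and file patterns are assumptions about my layout):

```shell
# prune_backups DIR DAYS - delete .sql/.tar backups in DIR whose
# modification time is more than DAYS days ago.
prune_backups() {
    find "$1" -type f \( -name '*.sql' -o -name '*.tar' \) \
        -mtime +"$2" -exec rm -f {} \;
}

# e.g. keep only the last 14 days of backups:
# prune_backups /root/backups 14
```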

  12. I should really post another topic but if you have any insight to the next question that would be great.

     

    Is there a .sh script to back up each table individually into a directory, then .zip/.tar it up? Then if I need a specific table, I can pull it out of the .tar rather than deal with the whole 300MB+ file.

     

    Many thanks
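    What I had in mind is looping over SHOW TABLES and running mysqldump once per table (the database name and output path are made up, and credentials are omitted, so this is only a sketch):

```shell
# dump_tables DB OUTDIR - dump each table of DB to its own .sql file
# in OUTDIR, then tar the whole directory up as OUTDIR.tar.
dump_tables() {
    db=$1
    out=$2
    mkdir -p "$out"
    for t in $(mysql -N -e 'SHOW TABLES' "$db"); do
        mysqldump "$db" "$t" > "$out/$t.sql"
    done
    tar -cf "$out.tar" -C "$(dirname "$out")" "$(basename "$out")"
}

# e.g. dump_tables mydb /root/backups/$(date +%Y_%m_%d)
```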

  13. I have no idea what /u01 is! It's a dedicated server set up by a hosting company, and we are just collating our websites onto one server (I'm no systems administrator, though!)

     

    So really, I could put our (very large) database backups in a directory under /home? Is that separate from the web content?

     

    Any idea how I can find out what /u01 is, or how I get to it?

     

    Thanks again

  14. Hi Kids,

     

    Simple question for you lot but not so much for a newbie like me...

     

    What does the following df -h output actually tell me:

    Filesystem            Size  Used Avail Use% Mounted on
    /dev/sda2             7.6G  1.8G  5.5G  25% /
    /dev/sda1             190M   25M  156M  14% /boot
    tmpfs                1014M     0 1014M   0% /dev/shm
    /dev/sda5             7.6G  387M  6.9G   6% /opt
    /dev/sda7              46G  4.0G   40G  10% /u01
    /dev/sda6             3.8G  1.4G  2.3G  38% /var
    

     

    Which partition are my databases stored on? Which partition are my webdocs stored on? Oh, I know that one: /var/www/vhosts/...

     

    But where is it safe to put nightly database backups so they don't fill up the partition where the web files are stored? (I've done that before and the website went down all weekend!)

     

    Any help and advice would be appreciated.

     

    Many thanks
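    For checking this sort of thing myself in future, running df against a specific path shows which partition that path lives on. A small helper (the example paths in the comments refer to my listing above):

```shell
# mount_of PATH - print the mount point that PATH lives on.
mount_of() {
    df -P "$1" | awk 'NR==2 {print $6}'
}

# e.g. mount_of /var/www/vhosts should print /var on my box, per the
# listing above, so web files share the 3.8G /var partition, and the
# 46G /u01 (mostly free) looks like the safest home for backups.
```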

  15. When I refresh the process list, the Time column just keeps going up; it's not closing any connections, even though I close them at the end of each query.

     

    Full Texts  	 Id  	 User  	 Host  	 db  	 Command  	 Time  	 State  	 Info
    Kill 	989 	website 	localhost 	DATABASE 	Sleep 	36 	  	NULL
    Kill 	998 	website 	localhost 	DATABASE 	Sleep 	7 	  	NULL
    Kill 	999 	website 	localhost 	DATABASE 	Sleep 	4 	  	NULL
    Kill 	1000 	website 	localhost 	DATABASE 	Sleep 	4 	  	NULL
    Kill 	1001 	website 	localhost 	DATABASE 	Sleep 	4 	  	NULL
    Kill 	1010 	website 	localhost 	DATABASE 	Query 	0 	NULL 	SHOW PROCESSLIST
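    From what I've read, those Sleep entries are idle connections waiting to be reused; if the PHP side opens persistent connections (mysql_pconnect), they won't close when the script ends. One safety net I may try, as an assumption rather than a confirmed fix, is lowering the server-side idle timeout in my.cnf:

```
[mysqld]
# close connections that have sat idle for longer than 60 seconds
wait_timeout = 60
```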

     

     
