
Posts posted by vineld

  1. That's not entirely true. You CAN use PHP in CSS if you generate the stylesheet dynamically (meaning you serve it from a PHP script rather than a static .css file), but it does not sound like that is what you want to do in this case. I suppose all you want to do is toggle the display depending on what the user does? Then you should simply use JavaScript.
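
     If you did want to go the PHP route, the whole trick is just a PHP script that sends a text/css header and echoes rules based on some condition. A minimal sketch (the parameter name and the selector are made up for the example):

     <?php
     // styles.php -- referenced as <link rel="stylesheet" href="styles.php?hide=1">
     header('Content-Type: text/css');

     // 'hide' and '#extra-info' are placeholder names for this example
     $hide = isset($_GET['hide']) && $_GET['hide'] == '1';

     echo '#extra-info { display: ' . ($hide ? 'none' : 'block') . "; }\n";
     ?>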

  2. If they're not too many you could store the information in a JavaScript array, for example. If there are too many (then maybe your solution isn't that good anyway) you could simply store the current database id and then select the previous or next entry when the user acts.
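
     For the database approach, the previous/next lookup is just an ORDER BY with a LIMIT. A rough PDO-based sketch, assuming a table called items with an integer id column (connection details and names are placeholders):

     <?php
     // Placeholder connection and id; 'items' is an assumed table name
     $pdo = new PDO('mysql:host=localhost;dbname=test', 'user', 'pass');
     $currentId = (int) $_GET['id'];

     // Next entry: the smallest id greater than the current one
     $stmt = $pdo->prepare('SELECT * FROM items WHERE id > ? ORDER BY id ASC LIMIT 1');
     $stmt->execute(array($currentId));
     $next = $stmt->fetch(PDO::FETCH_ASSOC);

     // Previous entry: the largest id smaller than the current one
     $stmt = $pdo->prepare('SELECT * FROM items WHERE id < ? ORDER BY id DESC LIMIT 1');
     $stmt->execute(array($currentId));
     $prev = $stmt->fetch(PDO::FETCH_ASSOC);
     ?>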

  3. I am experiencing some character encoding problems. Our system is built on ISO-8859-1 for various reasons, and our database text fields are in our local variant of latin-1. So far so good. However, when communicating with other servers we obviously need to send XML data in UTF-8, which is where the problems start. What happens is this:

     

    1. We have a page saved in ANSI

     

    2. I fetch text from the database

     

    3. I use utf8_encode(TEXT)

     

    4. The above is sent via XML

     

     However, on the receiving side the message shows up with characters like 'ö' instead of the correct UTF-8 string. Where am I going wrong here?
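
     For completeness, this is roughly what the sending side does; the sample text, element names and so on are just placeholders:

     <?php
     // $latinText stands in for a value fetched from the latin-1 database
     $latinText = 'Björn och Göran';

     // Convert from ISO-8859-1 to UTF-8 before it goes into the XML
     $utfText = utf8_encode($latinText);

     // The XML declaration (and the Content-Type header) must also say UTF-8
     $xml = '<?xml version="1.0" encoding="UTF-8"?>' . "\n"
          . '<message><body>' . htmlspecialchars($utfText, ENT_QUOTES, 'UTF-8') . '</body></message>';

     header('Content-Type: text/xml; charset=UTF-8');
     echo $xml;
     ?>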

  4. Good question, it seems to have something to do with /bin/bash. It is the correct path but it still complains (no such file or directory, bad interpreter). I have chosen the simple solution for now and just added the cron job directly to the crontab. That works, but I would still like to know what causes the problem.

  5. I am used to the control panels that web hosting companies provide, so I rarely work with crontab directly. Now I have no clue what I am doing wrong.

     

    In /home/DIR/ I have a file called crontab which contains the following:

     

    5 * * * * /home/DIR/public_html/_cron/FILE > /dev/null

    .... 2 more ....

    35 * * * * /home/DIR/public_html/ANOTHER_SITE_DOMAIN/_cron/MY_FILE > /dev/null

     

     

    Then, MY_FILE (no file extension) contains the following lines:

     

    #!/bin/bash

    cd /home/DIR/public_html/ANOTHER_SITE_DOMAIN/_cron/

    /usr/local/bin/php /home/DIR/public_html/ANOTHER_SITE_DOMAIN/_cron/blabla.php

     

    I have no clue where things are going wrong... When I run the final line via SSH it works just fine.

  6. The problem does not seem to be NuSOAP related at all. file_get_contents and cURL requests between the two servers are also slow, which means the problem lies with our server. Does anyone have a clue what could be wrong?

     

     NuSOAP and the web service in question work just fine when I try them from a web host of my own.

  7. I am experiencing some difficulties using the NuSOAP classes to connect to a .NET-based web service. We run PHP 4 on our server for various reasons, so we are not able to use the built-in SOAP functionality of PHP 5.

     

    I am able to connect to the web service and retrieve a result from a method. However, it takes forever. Every single request takes EXACTLY 20 seconds. No less, no more at any time.

     

     When I check the debug log, it appears as if it is our server that takes forever to connect. Here is where all the time is spent:

     

    2010-02-27 20:31:21.387007 soap_transport_http: connect connection_timeout 0, response_timeout 30, scheme http, host [REMOTE SERVER OF WEB SERVICE], port 80

    2010-02-27 20:31:21.387090 soap_transport_http: calling fsockopen with host [REMOTE SERVER OF WEB SERVICE] connection_timeout 0

    2010-02-27 20:31:41.403328 soap_transport_http: set response timeout to 30

    2010-02-27 20:31:41.403415 soap_transport_http: socket connected

     

     Every other part of the process seems to go fine.
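
     If it helps with debugging, this is the kind of bare fsockopen timing test (independent of NuSOAP) that should show whether the 20 seconds is spent in the raw TCP connect; the hostname is a placeholder:

     <?php
     $host = 'remote.example.com'; // placeholder for the web service host

     // microtime() without arguments also works on PHP 4
     list($usec, $sec) = explode(' ', microtime());
     $start = (float) $usec + (float) $sec;

     $fp = fsockopen($host, 80, $errno, $errstr, 30);

     list($usec, $sec) = explode(' ', microtime());
     $elapsed = ((float) $usec + (float) $sec) - $start;

     if ($fp) {
         echo 'connected in ' . round($elapsed, 3) . " seconds\n";
         fclose($fp);
     } else {
         echo 'failed after ' . round($elapsed, 3) . " seconds: $errstr ($errno)\n";
     }
     ?>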

     

    Does anyone have any clue as to what the problem could be? Any tips or hints would be highly appreciated.

  8. I am trying to scrape a search form page driven by Microsoft SharePoint using PHP and the cURL functions. However, I am only able to fetch the page with no search arguments at all, which returns all search results. As soon as I add a county variable it returns an error page without any usable message.

     

    __VIEWSTATE is usually not an obstacle but could it have something to do with the __REQUESTDIGEST field?
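
     For reference, this is roughly how I am posting the form; the URL and the extra field names are placeholders, and the hidden fields are scraped from the initial GET:

     <?php
     $url = 'http://sharepoint.example.com/search.aspx'; // placeholder URL

     // 1. Fetch the form page and pull out the hidden ASP.NET fields
     $ch = curl_init($url);
     curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
     $page = curl_exec($ch);
     curl_close($ch);

     preg_match('/id="__VIEWSTATE" value="([^"]*)"/', $page, $vs);
     preg_match('/id="__REQUESTDIGEST" value="([^"]*)"/', $page, $rd);

     // 2. Post the search with the hidden fields echoed back
     $fields = array(
         '__VIEWSTATE'     => isset($vs[1]) ? $vs[1] : '',
         '__REQUESTDIGEST' => isset($rd[1]) ? $rd[1] : '',
         'county'          => 'SomeCounty', // placeholder search field and value
     );

     $ch = curl_init($url);
     curl_setopt($ch, CURLOPT_POST, true);
     curl_setopt($ch, CURLOPT_POSTFIELDS, http_build_query($fields));
     curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
     $result = curl_exec($ch);
     curl_close($ch);
     ?>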

     

    Is this impossible or am I missing something?

  9. Could it have to do with permissions on the target side? The request reaches the target file and it produces output, although the cron job does not seem to be able to retrieve the content for some reason.

  10. Yup, the space is there.

     

     I tried switching to wget instead of PHP and then it works for some reason... I do, however, use PHP for the other cron jobs, although for another domain, but that shouldn't make a difference?

  11. I am running cron jobs on a shared server as follows:

     

     */2 * * * * /usr/local/bin/php /home/xxxxxx/domains/xxxxxx/public_html/mydir/myfile.php > /dev/null 2>&1

     

    However, the file never seems to run properly. I have other cron jobs running on the same server which work just fine.

     

     The file I am trying to run works as it should when I access it directly, so I am not at all sure what to do. What could be the problem?
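
     One sanity check might be to add a couple of lines like these at the top of myfile.php, just to see whether cron invokes the script at all (the log path is a placeholder):

     <?php
     // Temporary sanity check: append a timestamp on every run
     // so it is visible whether cron ever starts the script
     error_log(date('Y-m-d H:i:s') . " cron run\n", 3, '/tmp/cron_debug.log'); // placeholder log path

     // ... rest of the script ...
     ?>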

  12. The bad security reputation probably comes mostly from the fact that most people who want to dabble in web programming, and have never done any programming of any kind before, start out with PHP since it's relatively easy to learn. That means they are pretty lost at first, which usually results in poor code, especially when it comes to security. PHP takes less effort to learn and to get something working, but it takes more effort to write secure applications in it than in, for example, .NET.

  13. I need to protect certain XML feeds so that only specific sites are able to access them. The problem, however, is that the retrieving sites are hosted on a shared server, and therefore the following doesn't seem to work:

     

    AuthName "yadayada"
    AuthType Basic
    <Limit GET POST>
    order deny,allow
    deny from all
    allow from mydomain.com
    </Limit>

     

    If I do this:

     

    AuthName "yadayada"
    AuthType Basic
    <Limit GET POST>
    order deny,allow
    deny from all
    allow from 1.2.3.4
    </Limit>

     

     then everyone on that server will be able to access the XML feeds. It's not very likely that they will find out where the feeds are located, or that they would have any interest in them whatsoever, but I would still like to close this security hole.

     

    Is there any way to get around this problem?
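
     One possible workaround might be to skip .htaccess for this and let the feed script itself require a shared secret that only the retrieving sites know, so the shared server's IP no longer matters. A rough sketch where the key, the parameter name and the paths are placeholders:

     <?php
     // feed.php -- require a shared secret before serving the feed
     $secret = 'CHANGE_ME'; // placeholder, to be agreed with the retrieving sites

     if (!isset($_GET['key']) || $_GET['key'] !== $secret) {
         header('HTTP/1.0 403 Forbidden');
         exit;
     }

     header('Content-Type: text/xml; charset=ISO-8859-1');
     readfile('/path/outside/webroot/feed.xml'); // placeholder path to the actual feed
     ?>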
