I need to download only pages that belong to the domain gorgeoussales.com
I tried using:
wget -k --max-redirect=3 --tries=3 --domains=gorgeoussales.com -O /var/www/b.html http://gorgeoussales.com/22530
But the problem is that it also downloads pages from external sites such as watchshop.com/
Any hints? Regards
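For reference, my reading of the wget manual is that --domains only takes effect during recursive retrieval (-r); with a single URL and -O, wget fetches just that one page, and -k only rewrites the links inside it. A sketch of the recursive form I think is intended (the depth and the -P output directory are my guesses):

# recursive crawl, depth 2, restricted to gorgeoussales.com; -P sets the download directory
wget -r -l 2 --max-redirect=3 --tries=3 --domains=gorgeoussales.com -P /var/www http://gorgeoussales.com/22530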
I'm attempting to send post data through wget using the --post-data parameter, and I'm sure it's right, but it's simply not working.
I'm using --post-data like this:
wget --post-data 'first_name=test' http://www.testtest.com/test.php
What am I doing wrong?
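For what it's worth, the way I would sanity-check that the form data is actually being sent is to POST against an echo service and print the response (httpbin.org is just a public test endpoint, not related to the real target):

# -O - writes the server's response to stdout; the echoed JSON should list first_name=test under "form"
wget --post-data 'first_name=test' -O - http://httpbin.org/post

Multiple fields are joined with &, e.g. --post-data 'first_name=test&last_name=test'.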
Hello, I need this page, kijiji.it, to not download immediately, but it downloads right away, without any pause. I do not understand why --wait=100 --random-wait does not work:
wget --wait=100 --random-wait --user-agent="Mozilla/5.0 (X11; Fedora; Linux x86_64; rv:40.0) Gecko/20100101 Firefox/40.0" -b -k --html-extension --convert-links --max-redirect=10 --tries=0 -O /var/www//a.html www.kijiji.it
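If I read the manual right, --wait only pauses between successive retrievals, so a single-URL download starts immediately and the option has nothing to apply to. A sketch where the wait would actually kick in (the recursion depth and -P directory are my assumptions; note that wget warns against combining -O with -r, since everything would land in one file):

# pause ~100 s (randomized by --random-wait) between each page of the crawl
wget -r -l 2 --wait=100 --random-wait --user-agent="Mozilla/5.0 (X11; Fedora; Linux x86_64; rv:40.0) Gecko/20100101 Firefox/40.0" -k --html-extension -P /var/www www.kijiji.it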
I would like to make sure that this runs every 30 seconds:
seq 100 | parallel -N0 -j 2 php /usr/share/nginx/html/siti.php
I tried this, but it just does not work:
seq 100 watch -n 30 | parallel -N0 -j 2 php /usr/share/nginx/html/siti.php
Could anyone please help?
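From the watch man page, I believe the whole pipeline has to be handed to watch as one quoted argument; otherwise the shell splits the command at the pipe and watch only repeats the first part. Is this the right shape (untested on my side):

# re-run the entire pipeline every 30 seconds
watch -n 30 'seq 100 | parallel -N0 -j 2 php /usr/share/nginx/html/siti.php'

A plain shell loop would be an alternative if watch's screen redraw is unwanted:

while true; do seq 100 | parallel -N0 -j 2 php /usr/share/nginx/html/siti.php; sleep 30; done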