`wget` is usually used to download a file or web page via HTTP, so by default running `wget http://www.example.com/myscript.php` would simply create a local file called `myscript.php` containing the script's output. But I don't want that -- I want to execute the script and optionally redirect its output somewhere else (to a log file or into an email for reporting purposes). So here is how it's done:
$ wget -O - -q http://www.example.com/myscript.php >> log.txt
According to the `wget` man page, the `-O -` option tells `wget` not to save the file locally and instead write the result of the request to standard output. Also, `wget` normally produces its own output (a progress bar showing the status of the download and some other verbose information), but we don't care about that stuff, so we turn it off with the `-q` option. Lastly, `>> log.txt` appends the output of the script to a local file called `log.txt`. This could also be a pipe to send the output as an email.
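As a sketch of both variants (the URL is the placeholder from above, and the `mail` example assumes a configured local mail command):

```shell
# Append the script's output to a local log file.
wget -O - -q http://www.example.com/myscript.php >> log.txt

# Or pipe the output into an email instead of a file
# (assumes a working `mail` command on the system).
wget -O - -q http://www.example.com/myscript.php | mail -s "myscript.php output" [email protected]
```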
There is an incredible amount of power behind wget
and there are a lot of cool things you can use it for besides calling PHP scripts from the command line. Check out this LifeHacker article for a bunch of cool uses.
Are you sure this executes the script? I think it simply outputs the contents of the .php file, not the result.
Hi Yaha,
PHP is a server-side scripting language, which means that the PHP code is executed on the server before the web server returns the output to the person requesting it. In this case, `wget` is like a web browser requesting a PHP page on the web, so running it retrieves the script's output, not the contents of the PHP file itself.
Hi Raam,
Do you know a way to copy a 1 GB file from Webspace A to Webspace B with a PHP file? Does wget do the job?
I put my music files on Webspace A in one zip file (1 GB), and now I want to mirror them to Webspace B without downloading and re-uploading them again.
Yes, `wget` should do the trick, but honestly I would avoid using PHP to transfer large files like that. You’re better off using a command-line script (Bash or Python), as you may run into memory issues doing the transfer with PHP.
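If Webspace B offers shell access (e.g. over SSH), a minimal sketch would be to log in to B and pull the file directly from A; the hostname and filename here are placeholders:

```shell
# Run this on Webspace B. The URL is hypothetical -- substitute your own.
wget -O music.zip "http://webspace-a.example.com/music.zip"

# For a 1 GB file, -c is worth knowing: it resumes a partially
# downloaded file if the connection drops mid-transfer.
wget -c "http://webspace-a.example.com/music.zip"
```

This way the data moves directly from A to B, never through your home connection.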
I’ve got a file that I am trying to download via wget. The link can be found on this page:
http://www.tokutek.com/tokumx-ce-download-packages/
The download link is:
http://www.tokutek.com/download.php?download_file=tokumx-1.5.1-1.el6.x86_64.rpm
As you can see, there is a PHP script that is accepting the `download_file` argument. How can I download the file via wget? I’ve tried the suggestions here and in many other places, but I cannot seem to find a viable solution. The above suggestions will indeed download a file with whatever name I give it after the `>` redirection, but in reality it is just the index.html page under a different name.
Any ideas?
It looks like the `download.php` script sets a cookie with an authentication token, and then redirects you to the actual download after verifying the token it just set. If you want to script that, you’ll need to store the cookie the script gives you, then somehow retrieve the download by authenticating with the cookie.
You might want to look into using `curl` instead of `wget`, as the former has a lot more options for handling this type of thing.
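As a rough sketch of the cookie dance described above (whether it works depends on how `download.php` actually issues and checks its token):

```shell
URL="http://www.tokutek.com/download.php?download_file=tokumx-1.5.1-1.el6.x86_64.rpm"

# First request: let wget store any cookies the script sets.
# --keep-session-cookies also saves cookies that have no expiry time.
wget --save-cookies cookies.txt --keep-session-cookies -O /dev/null "$URL"

# Second request: send the stored cookies back to retrieve the real file.
wget --load-cookies cookies.txt -O tokumx-1.5.1-1.el6.x86_64.rpm "$URL"

# A curl equivalent in one shot: -c writes cookies, -b sends them,
# and -L follows any redirect to the actual download.
curl -c cookies.txt -b cookies.txt -L -o tokumx-1.5.1-1.el6.x86_64.rpm "$URL"
```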
Put your URL in quotes so the shell doesn’t interpret special characters like `?` and `;` before they reach the PHP script. For example:
wget -O - -q "http://domain.example.org/script.php?show=0;format=1"
Thanks for the tip! 🙂
You mention that “>> log.txt” can also be used to send the output to a certain e-mail address. Could you give an example of that? How would I go about sending the output of http://www.example.com/myscript.php to [email protected]?
Instead of `>> log.txt`, you would simply pipe the output into the mail command:
| mail -s "Subject goes here" [email protected]
See this page for more examples: http://www.binarytides.com/linux-mail-command-examples/
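Putting the pieces together, the full command would look something like this (the URL and address are placeholders):

```shell
# Run the PHP script over HTTP and mail its output instead of logging it.
wget -O - -q http://www.example.com/myscript.php | mail -s "Subject goes here" [email protected]
```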