Download Google Drive Files using wget

The problem: transferring a file between clouds

Files on Google Drive can be shared between users, but the default access to a file is through a web browser's graphical interface. Sometimes, however, it is useful, or even necessary, to access and download a file from a command line, for example with the wget utility.

I was recently faced with this dilemma while trying out a set of command examples that used a shared Google Drive file. The author of the (Jupyter notebook Python) script did not have to worry about transferring the file, as it was already within the author’s Google Drive folder. But I needed the file as well to test the notebook.

The script was on Google’s Colab system (see below), a free web-based service. To get that file I needed to download it, and the easiest way is to use wget. The trick is that any link to a shared file assumes the user will interact with it within a browser rather than download it directly (a naive download would fetch an HTML front page instead of the file).

(EDIT: See the alternate methods to upload a local file or download a remote file below.)

The solution

A quick search led me to this blog page: Download Google Drive Files using wget by
Anjan Chandra Paudel (May 3, 2019 – archived 01/22/2021), which had the critical information I needed.

First, the file has to be shared and be smaller than 100 MB (for larger files the command is more complex; see below).

The trick is to figure out the file’s long ID on Google Drive from the shared link. A share link has the standard form

https://drive.google.com/file/d/FILEID/view?usp=sharing

from which one can extract the file ID, here 1AnsoyBESGSYzRvbMQh5-FWJdgtTo_gOj, and construct the wget command based on the blog information, remembering that on a Colab Jupyter notebook access to the (Linux) system requires a ! placed before wget:

!wget --no-check-certificate 'https://docs.google.com/uc?export=download&id=1AnsoyBESGSYzRvbMQh5-FWJdgtTo_gOj' -O 'Kijij Listings - edited.xlsx'

Thus the file 'Kijij Listings - edited.xlsx' was downloaded to the local, temporary disk of the Colab cloud-based system. Since the file name contains blanks, it had to be placed within quotes.
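The two manual steps above, extracting the file ID from a share link and building the direct-download URL, can be sketched in Python. This is a hypothetical helper, not part of the original post; it assumes the two common Google Drive link shapes (/d/FILEID/ and ?id=FILEID) and the standard uc?export=download endpoint:

```python
import re

def extract_file_id(share_link):
    """Pull the long file ID out of a Google Drive share link.

    Handles the two common link shapes:
      .../file/d/FILEID/view...   and   ...?id=FILEID...
    """
    match = re.search(r'/d/([0-9A-Za-z_-]+)|[?&]id=([0-9A-Za-z_-]+)', share_link)
    if not match:
        raise ValueError('no file ID found in link: ' + share_link)
    return match.group(1) or match.group(2)

def export_url(file_id):
    """Build the direct-download URL that wget expects."""
    return 'https://docs.google.com/uc?export=download&id=' + file_id

link = 'https://drive.google.com/file/d/1AnsoyBESGSYzRvbMQh5-FWJdgtTo_gOj/view?usp=sharing'
print(export_url(extract_file_id(link)))
```

The printed URL is exactly the quoted argument passed to wget in the command above.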

Original post details

Before the file can be downloaded, it needs to be shared publicly.


  1. Right-click the file that needs to be downloaded.
  2. Click Share. A dialog box will appear.
  3. Click Advanced in the bottom-right corner.
  4. Click Change... under “Who has access”.
  5. Set it to “On – Public on the web”.
  6. Click the Save button.
  7. Copy the sharing link, which looks like https://drive.google.com/file/d/FILEID/view?usp=sharing.
  8. Extract the FILEID part from it, here 1UibyVC_C2hoT_XEw15gPEwPW4yFyJFeOEA.

So for a small file, run the following command in your terminal:

wget --no-check-certificate 'https://docs.google.com/uc?export=download&id=FILEID' -O FILENAME

In the above command, replace FILEID with the ID extracted above and FILENAME with a name of your own choosing.

Large files

For a large file, run the following command with the necessary changes to FILEID and FILENAME:

wget --load-cookies /tmp/cookies.txt "https://docs.google.com/uc?export=download&confirm=$(wget --quiet --save-cookies /tmp/cookies.txt --keep-session-cookies --no-check-certificate 'https://docs.google.com/uc?export=download&id=FILEID' -O- | sed -rn 's/.*confirm=([0-9A-Za-z_]+).*/\1\n/p')&id=FILEID" -O FILENAME && rm -rf /tmp/cookies.txt
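The inner wget in this command fetches Google’s “can’t scan this file for viruses” warning page, and the sed expression pulls the confirm= token out of it; the outer wget then resends the request with that token and the saved session cookies. The sed step can be mirrored in Python as a sketch (a hypothetical helper, assuming the warning page embeds the token as confirm=XXXX in a link, as it did at the time of writing):

```python
import re

def extract_confirm_token(warning_html):
    """Mimic the sed expression: grab the value following 'confirm='
    on the virus-scan interstitial page returned for large files."""
    match = re.search(r'confirm=([0-9A-Za-z_]+)', warning_html)
    return match.group(1) if match else None

# A minimal stand-in for the warning page Google returns for large files.
sample = '<a href="/uc?export=download&confirm=AbCd&id=FILEID">Download anyway</a>'
print(extract_confirm_token(sample))  # AbCd
```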

Edit: Upload local files and download remote files

The purpose of this post was to highlight the use of wget to transfer a file from cloud to cloud. However, if the file is on the local computer, it can be uploaded using Python code. In this example, the file(s) can be chosen with a graphical interface upon running the code within a Colab notebook:

from google.colab import files
uploaded = files.upload()

In the same way, a file computed by the notebook and located within the Colab environment can be downloaded to the local computer. For example, this code will download the file example.csv:

from google.colab import files
files.download('example.csv')


Image credits:

A Photoshop blend of two images.