The problem: transferring a file between clouds
Files on Google Drive can be shared between users, but the default access to a file is via a web browser's graphical interface. However, sometimes it may be useful, or even necessary, to access and download a file from a command line, for example with the wget command.
I was recently faced with this dilemma when testing a set of command examples that used a shared Google Drive file. The author of the (Jupyter notebook Python) script did not have to worry about transferring the file, as it was already within the author's own Google Drive folder. But I needed the file as well to test the notebook.
The script was on Google's "Colab" system (see below), a free web-based service. To get that file I needed to download it, and the easiest way is to use
wget. The catch is that any link to a shared file assumes the user will interact with it within a browser, not download it directly (a naive download would retrieve an HTML front page instead of the file).
(EDIT: See the alternate method to upload a local file or download a remote file below.)
First, the file has to be shared and be less than 100 MB (if it is larger, the command is more complex; see below).
The trick is to figure out the long name (ID) of the file on Google Drive from the shared link. The link I needed was:
from which one can extract the file ID as
1AnsoyBESGSYzRvbMQh5-FWJdgtTo_gOj and construct the
wget command based on the blog information, remembering that on a Colab Jupyter notebook access to the (Linux) system requires a
! placed before the command:
!wget --no-check-certificate 'https://drive.google.com/uc?export=download&id=1AnsoyBESGSYzRvbMQh5-FWJdgtTo_gOj' -O 'Kijij Listings - edited.xlsx'
Thus the file
'Kijij Listings - edited.xlsx' was downloaded to the local, temporary disk of the cloud-based Colab system. Because the file name contains blanks, it had to be placed within quotes.
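To confirm the transfer from within the notebook, the Colab runtime's disk can be checked from Python. A minimal sketch (the function name is my own):

```python
import os

def describe_file(path):
    """Report whether a downloaded file landed on the runtime's disk."""
    if os.path.exists(path):
        return f'{path}: {os.path.getsize(path)} bytes'
    return f'{path} not found - rerun the wget cell'

print(describe_file('Kijij Listings - edited.xlsx'))
```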
Original post details
Before the file can be downloaded, it needs to be shared publicly.
- Right-click the file that needs to be downloaded.
- Click Share. A dialog box will appear.
- Click Advanced in the bottom-right corner.
- Click Change... under "Who has access".
- Set it to "On - Public on the web".
- Click the Save button.
- Copy the sharing link, e.g. https://drive.google.com/file/d/1UibyVC_C2hoT_XEw15gPEwPW4yFyJFeOEA/view?usp=sharing
- Extract the FILEID part from the link, e.g. 1UibyVC_C2hoT_XEw15gPEwPW4yFyJFeOEA from the link above.
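The extraction step above can also be scripted; a minimal Python sketch (the function name is my own) that pulls the FILEID out of a sharing link:

```python
import re

def extract_file_id(share_link):
    """Pull the FILEID out of a Drive sharing link of the form
    https://drive.google.com/file/d/FILEID/view?usp=sharing"""
    match = re.search(r'/file/d/([0-9A-Za-z_-]+)', share_link)
    return match.group(1) if match else None

link = 'https://drive.google.com/file/d/1UibyVC_C2hoT_XEw15gPEwPW4yFyJFeOEA/view?usp=sharing'
print(extract_file_id(link))  # 1UibyVC_C2hoT_XEw15gPEwPW4yFyJFeOEA
```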
So, for a small file, run the following command in your terminal:
wget --no-check-certificate 'https://docs.google.com/uc?export=download&id=FILEID' -O FILENAME
In the command above, replace FILEID with the ID extracted above, and replace FILENAME with a name of your own choosing.
For a large file, run the following command with the necessary changes to FILEID and FILENAME:
wget --load-cookies /tmp/cookies.txt "https://docs.google.com/uc?export=download&confirm=$(wget --quiet --save-cookies /tmp/cookies.txt --keep-session-cookies --no-check-certificate 'https://docs.google.com/uc?export=download&id=FILEID' -O- | sed -rn 's/.*confirm=([0-9A-Za-z_]+).*/\1\n/p')&id=FILEID" -O FILENAME && rm -rf /tmp/cookies.txt
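The nested command above can be mirrored in Python with only the standard library. This is a sketch, not a definitive implementation: it assumes Google still embeds a confirm= token in the interstitial warning page for large files, and the function names are my own.

```python
import re
import urllib.request
from http.cookiejar import CookieJar

def extract_confirm_token(html):
    """Find the confirm token in the warning page, like the sed step above."""
    match = re.search(r'confirm=([0-9A-Za-z_]+)', html)
    return match.group(1) if match else None

def download_large_gdrive_file(file_id, filename):
    # Keep session cookies across both requests, mirroring wget's
    # --save-cookies / --load-cookies pair.
    opener = urllib.request.build_opener(
        urllib.request.HTTPCookieProcessor(CookieJar()))
    base = 'https://docs.google.com/uc?export=download&id=' + file_id
    # First request fetches the warning page (or the file, if it is small).
    page = opener.open(base).read().decode('utf-8', errors='replace')
    token = extract_confirm_token(page)
    url = base + ('&confirm=' + token if token else '')
    # Second request, with the confirm token, fetches the actual file.
    with opener.open(url) as resp, open(filename, 'wb') as out:
        out.write(resp.read())
```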
Edit: Upload local files and download remote file
The purpose of this post was to highlight the use of
wget to transfer a file from cloud to cloud. However, if the file is on the local computer, it can be uploaded using Python code. In this example, the file(s) can be chosen with a graphical interface when the code is run within a Colab notebook:
from google.colab import files
uploaded = files.upload()
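In Colab, files.upload() returns a plain dict mapping each chosen file name to its raw bytes, so the contents can be inspected or written to the runtime's disk. A sketch with a stand-in dict (a real upload would populate it instead):

```python
# Stand-in for the dict that files.upload() returns in Colab.
uploaded = {'Kijij Listings - edited.xlsx': b'fake spreadsheet bytes'}

for name, data in uploaded.items():
    print(f'{name}: {len(data)} bytes')
    # Save a copy on the runtime's disk so other cells can open it by name.
    with open(name, 'wb') as f:
        f.write(data)
```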
In the same way, a file computed by the notebook and located within the Colab environment can be downloaded to the local computer. For example, this code will download a file (results.csv here is a hypothetical name; substitute the actual file):
from google.colab import files
files.download('results.csv')
- Jupyter notebook I was testing:
- Or within the Colab setting:
- Colab tutorials
- Original post: https://medium.com/@acpanjan/download-google-drive-files-using-wget-3c2c025a8b99
- Alternate post: https://github.com/taewookim/wget_gdrive
A Photoshop blend of two pixabay.com images: