Hacker News

wget?


wget with wildcards is what we use to get biology data. Those are FTP sites, though, so some experimentation may be needed to make it work with HTTP...
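For reference, wget's wildcard support applies only to FTP URLs, where the server returns a directory listing that wget can glob against. A minimal sketch, with a hypothetical FTP path standing in for a real biology mirror:

```shell
# Fetch every gzipped FASTA file from one directory.
# The host and path are hypothetical; quote the glob so the
# shell doesn't try to expand it locally.
wget 'ftp://ftp.example.org/pub/release-110/fasta/*.fa.gz'

# -m (mirror) recurses and preserves timestamps, which is handy
# for periodically re-syncing a reference data directory.
wget -m 'ftp://ftp.example.org/pub/release-110/fasta/'
```

Note that the glob is interpreted by wget against the FTP server's LIST output, not by the remote shell.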


What? No, it works fine with HTTP out of the box. From the man page:

GNU Wget is a free utility for non-interactive download of files from the Web. It supports HTTP, HTTPS, and FTP protocols, as well as retrieval through HTTP proxies.


I think the GP's uncertainty is about wildcards in download URLs over HTTP. Unlike FTP, HTTP servers aren't required to honor a request to list all the files in a directory, so there's nothing for a wildcard to match against.
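Since plain HTTP has no directory-listing operation, the usual workaround is to let wget crawl the server's HTML index page and filter links by filename pattern. A sketch with a hypothetical URL; `-r` recurses, `-np` refuses to ascend above the start directory, `-nd` flattens the output, and `-A` keeps only matching names:

```shell
# Follow links from the index page, downloading only *.fa.gz files.
# This only works if the server actually serves an index page
# (auto-generated or hand-written) that links to the files.
wget -r -np -nd -A '*.fa.gz' 'https://data.example.org/release-110/fasta/'
```

If the server serves no index at all, there is no portable way to discover the files, and you need an explicit list of URLs (e.g. `wget -i urls.txt`).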


Thankfully the particular website in question already spits out a file listing for you.



