Screenshot of Wget running on Ubuntu.

Original author(s) | Hrvoje Nikšić |
---|---|
Developer(s) | Giuseppe Scrivano, Tim Rühsen, Darshit Shah |
Initial release | January 1996 |
Stable release | 1.25.0[1] |
Repository | |
Written in | C |
Platform | Cross-platform |
Type | FTP client / HTTP client |
License | GPL-3.0-or-later[a][2] |
Website | www |
GNU Wget (or just Wget, formerly Geturl, also written as its package name, wget) is a computer program that retrieves content from web servers. It is part of the GNU Project. Its name derives from "World Wide Web" and "get". It supports downloading via HTTP, HTTPS, and FTP.
Its features include recursive download, conversion of links for offline viewing of local HTML, and support for proxies. It appeared in 1996, coinciding with the boom in popularity of the Web, which led to its wide use among Unix users and its distribution with most major Linux distributions. Wget is written in C and can be easily installed on any Unix-like system. Wget has been ported to Microsoft Windows, macOS, OpenVMS, HP-UX, AmigaOS, MorphOS, and Solaris. Since version 1.14, Wget has been able to save its output in the web archiving standard WARC format.[3]
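The WARC capability can be exercised directly from the command line. A minimal sketch, assuming Wget 1.14 or later; the URL is a placeholder:

```shell
# Fetch a page and record the full HTTP request/response transaction
# in example.warc.gz, alongside the normally saved files.
wget --warc-file=example "https://www.gnu.org/"
```

The resulting WARC file can be replayed or ingested by standard web-archiving tools.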
Wget descends from an earlier program named Geturl by the same author,[4] the development of which commenced in late 1995. The name changed to Wget after the author became aware of an earlier Amiga program named GetURL, written by James Burton in AREXX.
Wget filled a gap in the inconsistent web-downloading software available in the mid-1990s. No single program could reliably use both HTTP and FTP to download files. Existing programs either supported FTP (such as NcFTP and dl) or were written in Perl, which was not yet ubiquitous. While Wget was inspired by features of some of the existing programs, it supported both HTTP and FTP and could be built using only the standard development tools found on every Unix system.
At that time many Unix users struggled behind extremely slow university and dial-up Internet connections, leading to a growing need for a downloading agent that could deal with transient network failures without assistance from the human operator.
Wget has been designed for robustness over slow or unstable network connections. If a download does not complete due to a network problem, Wget will automatically try to continue the download from where it left off, and repeat this until the whole file has been retrieved. It was one of the first clients to make use of the then-new Range HTTP header to support this feature.
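Resumption is exposed through the -c (--continue) option; when a partial file exists, Wget issues a Range request so the server returns only the missing tail. A sketch with an illustrative URL:

```shell
# Resume a partial download; Wget sends a header such as
# "Range: bytes=1048576-" based on the size of the local file.
# -t 0 retries indefinitely; --waitretry backs off up to 10 s between tries.
wget -c -t 0 --waitretry=10 "https://example.com/big-file.iso"
```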
Wget can optionally work like a web crawler by extracting resources linked from HTML pages and downloading them in sequence, repeating the process recursively until all the pages have been downloaded or a maximum recursion depth specified by the user has been reached. The downloaded pages are saved in a directory structure resembling that on the remote server. This "recursive download" enables partial or complete mirroring of web sites via HTTP. Links in downloaded HTML pages can be adjusted to point to locally downloaded material for offline viewing. When performing this kind of automatic mirroring of web sites, Wget supports the Robots Exclusion Standard (unless the option -e robots=off is used).
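A typical recursive invocation combining these options might look like the following; the URL is a placeholder:

```shell
# Crawl two levels deep (-l 2), fetch page requisites such as images
# and stylesheets (-p), rewrite links for offline viewing (-k), and
# never ascend above the start directory (-np).
wget -r -l 2 -p -k -np "https://example.com/docs/"
```

Adding -e robots=off to the same command would disable the Robots Exclusion Standard handling described above.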
Recursive download works with FTP as well, where Wget issues the LIST command to find which additional files to download, repeating this process for directories and files under the one specified in the top URL. Shell-like wildcards are supported when the download of FTP URLs is requested.
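For example, a wildcard FTP download might be requested as follows (the host and path are placeholders):

```shell
# Quote the URL so the local shell does not expand the wildcard;
# Wget issues LIST to enumerate the directory and matches *.tar.gz itself.
wget 'ftp://ftp.example.com/pub/releases/*.tar.gz'
```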
When downloading recursively over either HTTP or FTP, Wget can be instructed to inspect the timestamps of local and remote files, and download only the remote files newer than the corresponding local ones. This allows easy mirroring of HTTP and FTP sites, but is considered inefficient and more error-prone when compared to programs designed for mirroring from the ground up, such as rsync. On the other hand, Wget does not require special server-side software for this task.
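Timestamp-based mirroring is enabled with -N (--timestamping); a hedged sketch against a placeholder URL:

```shell
# Re-run a recursive download, fetching only files that are newer
# on the server than the local copies.
wget -r -N "https://example.com/site/"

# -m (--mirror) is shorthand for -r -N -l inf --no-remove-listing.
wget -m "https://example.com/site/"
```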
Wget is non-interactive in the sense that, once started, it does not require user interaction and does not need to control a TTY, being able to log its progress to a separate file for later inspection. Users can start Wget and log off, leaving the program unattended. By contrast, most graphical or text user interface web browsers require the user to remain logged in and to manually restart failed downloads, which can be a great hindrance when transferring a lot of data.
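The unattended workflow described above typically combines backgrounding with a log file; the URL and log name here are illustrative:

```shell
# -b forks Wget into the background immediately; -o writes all progress
# messages to fetch.log instead of the terminal. The transfer continues
# after the user logs out of the shell session.
wget -b -o fetch.log "https://example.com/big-file.iso"

# Inspect progress later:
tail -f fetch.log
</imports>
```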
Written in a highly portable style of C with minimal dependencies on third-party libraries, Wget requires little more than a C compiler and a BSD-like interface to TCP/IP networking.[citation needed] Designed as a Unix program invoked from the Unix shell, the program has been ported to numerous Unix-like environments and systems, including Microsoft Windows via Cygwin, and macOS. It is also available as a native Microsoft Windows program as one of the GnuWin packages.
GNU Wget was written by Hrvoje Nikšić with contributions by many other people, including Dan Harkless, Ian Abbott, and Mauro Tortonesi. Significant contributions are credited in the AUTHORS file included in the distribution, and all remaining ones are documented in the changelogs, also included with the program. Wget is currently maintained by Giuseppe Scrivano, Tim Rühsen and Darshit Shah.[5]
The copyright to Wget belongs to the Free Software Foundation, whose policy is to require copyright assignments for all non-trivial contributions to GNU software.[6]
GNU Wget is distributed under the terms of the GNU General Public License, version 3 or later, with a special exception that allows distribution of binaries linked against the OpenSSL library. The text of the exception follows:[2]
Additional permission under GNU GPL version 3 section 7
If you modify this program, or any covered work, by linking or combining it with the OpenSSL project's OpenSSL library (or a modified version of that library), containing parts covered by the terms of the OpenSSL or SSLeay licenses, the Free Software Foundation grants you additional permission to convey the resulting work. Corresponding Source for a non-source form of such a combination shall include the source code for the parts of OpenSSL used as well as that of the covered work.
It is expected[by whom?] that the exception clause will be removed once Wget is modified to also link with the GnuTLS library.
Wget's documentation, in the form of a Texinfo reference manual, is distributed under the terms of the GNU Free Documentation License, version 1.2 or later. The man page usually distributed on Unix-like systems is automatically generated from a subset of the Texinfo manual and falls under the terms of the same license.
Wget is developed in an open fashion; most design decisions are discussed on the public mailing list,[7] which is followed by users and developers. Bug reports and patches are relayed to the same list.
The preferred method of contributing to Wget's code and documentation is through source updates in the form of textual patches generated by the diff utility. Patches intended for inclusion in Wget are submitted to the mailing list,[7] where they are reviewed by the maintainers. Patches that pass the maintainers' scrutiny are installed in the sources. Instructions on patch creation as well as style guidelines are outlined on the project's wiki.[8]
The source code can also be tracked via a remote version control repository that hosts revision history beginning with the 1.5.3 release. The repository is currently running Git.[9] Before that, the source code had been hosted (in reverse chronological order) on Bazaar,[10] Mercurial, Subversion, and CVS.
When a sufficient number of features or bug fixes accumulate during development, Wget is released to the general public via the GNU FTP site and its mirrors. Because the project is run entirely by volunteers, there is no external pressure to issue a release, nor are there enforceable release deadlines.
Releases are numbered as versions of the form major.minor[.revision], such as Wget 1.11 or Wget 1.8.2. An increase of the major version number represents large and possibly incompatible changes in Wget's behavior or a radical redesign of the code base. An increase of the minor version number designates the addition of new features and bug fixes. A new revision indicates a release that, compared to the previous revision, only contains bug fixes. Revision zero is omitted, meaning that, for example, Wget 1.11 is the same as 1.11.0. Wget does not use the odd-even release number convention popularized by Linux.
Wget makes an appearance in the 2010 Columbia Pictures motion picture The Social Network. The lead character, a somewhat fictionalized version of Facebook co-founder Mark Zuckerberg, uses Wget to aggregate student photos from various Harvard University housing-facility directories.
The following releases represent notable milestones in Wget's development. Features listed next to each release are edited for brevity and do not constitute comprehensive information about the release, which is available in the NEWS file distributed with Wget.[4]
- Support for the Content-Disposition header, which is often used by CGI scripts to indicate the name of a file for downloading. Security-related improvements were also made to the HTTP authentication code. Micah Cowan took over maintainership of the project.
- The --retry-on-host-error option for more reliability, and the --accept-regex and --reject-regex options for recursive FTP retrievals.

Initial release | 26 September 2021 |
---|---|
Stable release | 2.2.0[13] |
Repository | gitlab |
License | GPL-3.0-or-later[14] |
Website | www |
GNU Wget2 2.0.0 was released on 26 September 2021. It is licensed under the GPL-3.0-or-later license, and is wrapped around Libwget, which is under the LGPL-3.0-or-later license.[14] It has many improvements in comparison to Wget; in particular, in many cases Wget2 downloads much faster than Wget 1.x due to support of the following protocols and technologies:[15]
GWget is a free software graphical user interface for Wget. It is developed by David Sedeño Fernández and is based on the GNOME software stack. GWget supports all of the main features that Wget does, as well as parallel downloads.[16]
Cliget is an open source Firefox addon downloader that uses Curl, Wget and Aria2. It is developed by Zaid Abdulla.[17][18][19]
There exist clones of GNU Wget intended for embedded systems, which are often limited in memory and storage. They support its most basic options, usually limited to downloading.
Wget 1.4.0 [formerly known as Geturl] is an extensive rewrite of Geturl.