robotstxt: A 'robots.txt' Parser and 'Webbot'/'Spider'/'Crawler' Permissions Checker

Provides functions to download and parse 'robots.txt' files. Ultimately, the package makes it easy to check whether bots (spiders, crawlers, scrapers, ...) are allowed to access specific resources on a domain.
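As a quick illustration of the permission check described above, the sketch below parses an in-memory robots.txt and queries it for two paths. It assumes the `robotstxt()` constructor (with its `text` argument) and the object's `$check()` method; the exact interface may differ between package versions.

```r
library(robotstxt)

# Build a robots.txt object from text instead of downloading it,
# so the example needs no network access.
rtxt <- robotstxt(
  text = "User-agent: *\nDisallow: /private/\n"
)

# Ask whether a generic bot ("*") may fetch each path;
# /private/ is disallowed by the rules above.
rtxt$check(paths = c("/index.html", "/private/data.csv"), bot = "*")
```

For a live check against a real domain, `paths_allowed(paths = "/", domain = "example.com")` wraps the download-and-check steps in one call.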

Version: 0.7.15
Depends: R (≥ 3.0.0)
Imports: stringr (≥ 1.0.0), httr (≥ 1.0.0), spiderbar (≥ 0.2.0), future.apply (≥ 1.0.0), magrittr, utils
Suggests: knitr, rmarkdown, dplyr, testthat, covr, curl
Published: 2024-08-29
DOI: 10.32614/CRAN.package.robotstxt
Author: Pedro Baltazar [aut, cre], Peter Meissner [aut], Kun Ren [aut, cph] (Author and copyright holder of list_merge.R.), Oliver Keys [ctb] (original release code review), Rich Fitz John [ctb] (original release code review)
Maintainer: Pedro Baltazar <pedrobtz at gmail.com>
BugReports: https://github.com/ropensci/robotstxt/issues
License: MIT + file LICENSE
URL: https://docs.ropensci.org/robotstxt/, https://github.com/ropensci/robotstxt
NeedsCompilation: no
Materials: NEWS
In views: WebTechnologies
CRAN checks: robotstxt results

Documentation:

Reference manual: robotstxt.html, robotstxt.pdf
Vignettes: using_robotstxt (source)

Downloads:

Package source: robotstxt_0.7.15.tar.gz
Windows binaries: r-devel: robotstxt_0.7.15.zip, r-release: robotstxt_0.7.15.zip, r-oldrel: robotstxt_0.7.15.zip
macOS binaries: r-release (arm64): robotstxt_0.7.15.tgz, r-oldrel (arm64): robotstxt_0.7.15.tgz, r-release (x86_64): robotstxt_0.7.15.tgz, r-oldrel (x86_64): robotstxt_0.7.15.tgz
Old sources: robotstxt archive

Reverse dependencies:

Reverse imports: BAwiR, polite, ralger, readapra
Reverse suggests: newsanchor, spiderbar, vosonSML, webchem

Linking:

Please use the canonical form https://CRAN.R-project.org/package=robotstxt to link to this page.

