
721tools/opensea-scraper

Scrapes NFT floor prices and additional information from OpenSea. Used for https://nftfloorprice.info
🎉 UPDATE 2021-Nov-3: OpenSea officially updated their API. You can now get accurate realtime floor prices from this endpoint: https://api.opensea.io/api/v1/collection/{slug}/stats

```javascript
const axios = require("axios");

async function getFloorPrice(slug) {
  try {
    const url = `https://api.opensea.io/api/v1/collection/${slug}/stats`;
    const response = await axios.get(url);
    return response.data.stats.floor_price;
  } catch (err) {
    console.log(err);
    return undefined;
  }
}

await getFloorPrice("lostpoets");
await getFloorPrice("treeverse");
await getFloorPrice("cool-cats-nft");
```

If you need floor prices, please use the official API (see above 👆). This scraper can still be used to scrape additional information about offers (tokenId, name, tokenContractAddress and offerUrl) as well as the rankings.

Install

npm install opensea-scraper

Usage

slug is the human-readable identifier that OpenSea uses to identify a collection. It can be extracted from the collection URL: https://opensea.io/collection/{slug}
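For illustration, here is a small helper (not part of opensea-scraper, just a sketch) that extracts the slug from a collection URL:

```javascript
// Hypothetical helper: pull the slug out of an opensea collection URL.
// Not part of the library — shown only to illustrate where the slug lives.
function slugFromUrl(collectionUrl) {
  const match = new URL(collectionUrl).pathname.match(/^\/collection\/([^/]+)/);
  return match ? match[1] : undefined;
}

console.log(slugFromUrl("https://opensea.io/collection/cool-cats-nft")); // → "cool-cats-nft"
```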

options is an object with the following keys:

  • debug [Boolean]: launch chromium locally and omit headless mode (default: false)
  • logs [Boolean]: display logs in the console (default: false)
  • sort [Boolean]: sort the offers from lowest to highest price (default: true)
  • browserInstance [PuppeteerBrowser]: bring your own browser instance for more control

```javascript
const OpenseaScraper = require("opensea-scraper");

// which nft project to scrape?
const slug = "cool-cats-nft";

// options
const options = {
  debug: false,
  logs: false,
  sort: true,
  browserInstance: undefined,
};

// get basic info (from the opensea API)
const basicInfo = await OpenseaScraper.basicInfo(slug);

// get offers from opensea. Each offer includes the floor price, tokenName,
// tokenId, tokenContractAddress and offerUrl
let result = await OpenseaScraper.offers(slug, options);
console.dir(result, { depth: null }); // result object contains keys `stats` and `offers`

// get offers from opensea using a custom link.
// Opensea supports encoding filtering in the URL, so this method is helpful for getting
// a specific asset (for example the floor price for a LAND token from the sandbox collection)
let url = "https://opensea.io/collection/sandbox?search[sortAscending]=true&search[sortBy]=PRICE&search[stringTraits][0][name]=Type&search[stringTraits][0][values][0]=Land&search[toggles][0]=BUY_NOW";
result = await OpenseaScraper.offersByUrl(url, options);
console.dir(result, { depth: null }); // result object contains keys `stats` and `offers`

// get offersByScrolling from opensea. This is an alternative method to get the same
// data as the function `offers`, with the only difference that the data is scraped
// actively by scrolling through the page. This method is not as efficient as the
// `offers` method, but it can scrape more than 32 offers. You could even scrape a
// whole collection with ~10k spots (this is not recommended though).
let resultSize = 40; // if you need fewer than 32 offers, please use the function `offers()` instead
result = await OpenseaScraper.offersByScrolling(slug, resultSize, options);
console.dir(result, { depth: null }); // result object contains keys `stats` and `offers`

// get offersByScrollingByUrl from opensea using a custom link instead of the slug;
// the same logic applies as in `offersByScrolling()`.
// Opensea supports encoding filtering in the URL, so this method is helpful for getting
// a specific asset (for example the floor price for a LAND token from the sandbox collection)
url = "https://opensea.io/collection/sandbox?search[sortAscending]=true&search[sortBy]=PRICE&search[stringTraits][0][name]=Type&search[stringTraits][0][values][0]=Land&search[toggles][0]=BUY_NOW";
resultSize = 40; // if you need fewer than 32 offers, please use the function `offers()` instead
result = await OpenseaScraper.offersByScrollingByUrl(url, resultSize, options);
console.dir(result, { depth: null }); // result object contains keys `stats` and `offers`

// scrape all slugs, names and ranks from the top collections on the rankings page.
// "type" is one of the following:
//   "24h": ranking of the last 24 hours: https://opensea.io/rankings?sortBy=one_day_volume
//   "7d": ranking of the last 7 days: https://opensea.io/rankings?sortBy=seven_day_volume
//   "30d": ranking of the last 30 days: https://opensea.io/rankings?sortBy=thirty_day_volume
//   "total": all-time ranking: https://opensea.io/rankings?sortBy=total_volume
// "chain" is one of the following: "ethereum", "matic", "klaytn", "solana";
//   if chain is unset, all chains will be selected by default
const type = "24h"; // possible values: "24h", "7d", "30d", "total"
const chain = "solana";
const ranking = await OpenseaScraper.rankings(type, options, chain);
```
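As a sketch of consuming the result, here is one way to pick the cheapest offer from the `offers` key. The exact field names (`floorPrice`, `name`, etc.) are assumptions for illustration — check them against a real result object; mock data stands in for a live scrape:

```javascript
// Sketch with mock data. The offer shape below is an assumption, not the
// library's documented schema — inspect a real `offers()` result before relying on it.
const offers = [
  { name: "Cool Cat #1", tokenId: "1", floorPrice: { amount: 7.5, currency: "ETH" } },
  { name: "Cool Cat #2", tokenId: "2", floorPrice: { amount: 6.9, currency: "ETH" } },
];

// reduce to the offer with the lowest price
const cheapest = offers.reduce((a, b) =>
  a.floorPrice.amount <= b.floorPrice.amount ? a : b
);

console.log(`${cheapest.name} @ ${cheapest.floorPrice.amount} ${cheapest.floorPrice.currency}`);
// → "Cool Cat #2 @ 6.9 ETH"
```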

Debugging

To investigate an issue, turn on logs and debug mode (debug: true and logs: true):

```javascript
const result = await OpenseaScraper.offers("treeverse", { debug: true, logs: true });
```

Bring your own puppeteer

If you want to customize the settings for your puppeteer instance, you can pass your own puppeteer browser instance in the options. 🚧 IMPORTANT: I recommend using the stealth plugin, as otherwise you most likely won't be able to scrape OpenSea. If you find a way without using the stealth plugin, please report it in the form of an issue!

```javascript
const puppeteer = require("puppeteer-extra");

// add stealth plugin and use defaults (all evasion techniques)
const StealthPlugin = require("puppeteer-extra-plugin-stealth");
puppeteer.use(StealthPlugin());

const myPuppeteerInstance = await puppeteer.launch(myCustomSettings);

const result = await OpenseaScraper.offers("cool-cats-nft", {
  browserInstance: myPuppeteerInstance,
});
```

Demo

npm run demo

Run local console / REPL

To test the functions in a REPL node environment that has the OpenseaScraper service preloaded, simply run:

node --experimental-repl-await -i -e"$(< init-dev-env.js)"

I recommend saving an alias:

alias consl='node --experimental-repl-await -i -e "$(< init-dev-env.js)"';

Contribute

Open PR or issue if you would like to have more features added.

Donations 🙏

Thanks for your support!

  • BTC: bc1qq5qn96ahlqjxfxz2n9l20kem8p9nsz5yzz93f7
  • ETH: 0x3e4503720Fb8f4559Ecf64BE792b3100722dE940

nftfloorprice.info 🔔

Simple NFT floor price alerts. Easily track all your NFTs and receive realtime email alerts: https://nftfloorprice.info
