Net HTTP

Downloads the specified page, supports multi-page scraping

Net HTTP is a universal scraper for solving most non-standard tasks. It can serve as the basis for parsing arbitrary content from any website: given a URL, it downloads the page's source code (a rough equivalent is sketched after the list below).

Data collected

  • Content
  • Server response code
  • Server response description
  • Server response headers
  • Proxies used in the request
  • Array with all collected pages
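These fields map closely onto a plain HTTP request. Below is a minimal sketch of an equivalent request, assuming Python with the third-party requests library as a stand-in for the scraper's internals; the fetch() helper and the dictionary keys are illustrative, not DataMassif's actual output names.

```python
import requests

def fetch(url, proxies=None):
    """Download a page and gather fields similar to those listed above."""
    response = requests.get(url, proxies=proxies, timeout=30)
    return {
        "content": response.text,             # page source code
        "status_code": response.status_code,  # server response code
        "status_reason": response.reason,     # server response description
        "headers": dict(response.headers),    # server response headers
        "proxies": proxies or {},             # proxies used in the request
        # Redirect chain as a rough stand-in for "array with all collected pages"
        "pages": [r.url for r in response.history] + [response.url],
    }

if __name__ == "__main__":
    page = fetch("https://example.com")
    print(page["status_code"], page["status_reason"], len(page["content"]))
```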

Use Cases

  • Downloading content
  • Downloading images
  • Checking the server response code
  • Checking for HTTPS
  • Checking for redirects
  • Outputting a list of redirect URLs
  • Obtaining page size
  • Collecting meta tags
  • Extracting data from the page source code and/or headers (see the sketch after this list)
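Several of the use cases above reduce to inspecting the response of a single request. The sketch below illustrates a few of them, again assuming Python with the requests library; the inspect() helper and its result keys are hypothetical and not part of the Net HTTP API.

```python
import re
import requests

def inspect(url):
    """Illustrate a few checks: HTTPS, redirects, page size, meta tags."""
    response = requests.get(url, timeout=30, allow_redirects=True)

    return {
        "final_url": response.url,
        "is_https": response.url.startswith("https://"),    # checking for HTTPS
        "was_redirected": bool(response.history),            # checking for redirects
        "redirect_urls": [r.url for r in response.history],  # list of redirect URLs
        "status_code": response.status_code,                 # server response code
        "page_size_bytes": len(response.content),            # page size in bytes
        # Meta tags pulled from the page source with a simple regex
        "meta_tags": re.findall(r"<meta\b[^>]*>", response.text, flags=re.IGNORECASE),
    }

if __name__ == "__main__":
    info = inspect("http://example.com")
    print(info["status_code"], info["final_url"], info["page_size_bytes"])
```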

Similar scrapers

Other tools in the "Content & Backlink Scrapers" category.

HTML ArticleExtractor

Collects articles from web pages: title and content, both with and without HTML markup

HTML EmailExtractor

Scrapes email addresses from website pages

HTML LinkExtractor

Scrapes external and internal links from the specified site; internal links can be followed down to the selected depth level.

HTML TextExtractor

A text block scraper that collects content from arbitrary websites