Getting Started

Finding the right API or Scraper for the data you need

Written by Raunaq Singh
Updated over a month ago

What is Unwrangle?

Unwrangle is a scraping API and data extraction tool.

How to use Unwrangle?

You can scrape data with Unwrangle in 2 ways:

  1. APIs - Fetch page-wise data in real-time with one of our APIs

  2. Scrapers - Extract records from multiple pages using a no-code app or an API

APIs FAQs

Is the data stored?

Our APIs scrape data from the selected source in real time. The data returned is as presented by the publisher at the time of the request. No responses are cached or stored on our end.

What does real-time mean?

Most requests return a response within 10 seconds. However, a response may be delayed by up to 40-60 seconds due to network latency.

What is the success rate?

We monitor the APIs 24/7 and make updates as necessary to ensure a high success rate. That said, due to the nature of scraping, requests can occasionally fail. A 400 response code means invalid input, and a 500 response code means an error on our end. When using the API in your project, consider adding up to 3 retries to your API calls to handle errors.
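
As a rough sketch, a retry wrapper in Python could look like the following (the endpoint URL and api_key query parameter here are placeholders, not a specific Unwrangle API; adapt them to the API you are calling as described in its docs):

import time
import requests

def fetch_with_retries(url, params, max_retries=3):
    # Call an API endpoint, retrying up to max_retries times on 5xx responses.
    for attempt in range(1, max_retries + 1):
        response = requests.get(url, params=params, timeout=60)
        if response.status_code < 500:
            # 2xx success or 4xx invalid input: retrying won't help, return as-is
            return response
        # 5xx: error on the server side, back off briefly before retrying
        time.sleep(2 * attempt)
    return response

# Example usage (placeholder endpoint and parameters):
# response = fetch_with_retries(api_url, {"api_key": "YOUR_API_KEY"})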

How do credits work?

Each request costs a certain number of credits. The credit cost varies by API, with a minimum of 1 credit per request. Read the docs for each API to see its request cost.

Scrapers FAQs

What are scrapers?

Scrapers operate asynchronously, meaning they don't provide real-time results. They enable you to extract data from multiple pages and URLs simultaneously, all while running on our servers. Pagination and retries are handled on our end, making scrapers a convenient option for gathering data from several pages when immediate results aren't required.

How to use Scrapers?

Use scrapers with our no-code app:

Log in to your account, navigate to Scrapers, and click Create Job.

[Screenshot: Page after navigating to Scrapers]

Select the scraper or platform you want to scrape data from, add the required input for the scraper, and submit the form. Using this method, you can also schedule the job to run at regular intervals if required.

[Screenshot: Creating a job with the Google Maps Search Scraper]

When the job is complete, the data can be downloaded in CSV format.

Use scrapers via API:

Get your API key and make a POST request to the following endpoint:

POST https://data.unwrangle.com/api/jobs

Add the required input in the body with the following schema:

{
  "service_platform": "amazon_reviews",
  "url": "https://www.amazon.com/Black-Decker-BPWM09W-Portable-Washer/dp/B0799Q45TT",
  "webhook_url": "https://hooks.zapier.com/hooks/catch/19721527/24heo0o/"
}

service_platform and url are required. webhook_url is optional.
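
As a minimal sketch, the request could be made in Python as shown below (the api_key query parameter used for authentication here is an assumption; check the docs for the exact authentication method and the required input fields for your scraper):

import requests

API_KEY = "YOUR_API_KEY"  # from your Unwrangle account

payload = {
    "service_platform": "amazon_reviews",
    "url": "https://www.amazon.com/Black-Decker-BPWM09W-Portable-Washer/dp/B0799Q45TT",
    # Optional: results are sent to this URL when the job completes
    "webhook_url": "https://hooks.zapier.com/hooks/catch/19721527/24heo0o/",
}

response = requests.post(
    "https://data.unwrangle.com/api/jobs",
    json=payload,
    params={"api_key": API_KEY},  # assumption: key sent as a query parameter
    timeout=60,
)
job = response.json()
print(job["id"], job["status"])  # e.g. 30448 queued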

Once the job is created, you will receive a 200 response with the job ID and other details:

{
  "created": "2024-10-02T06:43:35.279672Z",
  "id": 30448,
  "url": "https://www.amazon.com/Black-Decker-BPWM09W-Portable-Washer/dp/B0799Q45TT",
  "search": null,
  "urls": [],
  "status": "queued",
  "n": null,
  "service_platform": "amazon_reviews",
  "from_date": null,
  "webhook_url": "https://hooks.zapier.com/hooks/catch/19721527/24heo0o/",
  "external_id": null,
  "is_webhook_sent": false,
  "credits_used": 10,
  "completed": null,
  "lang": null,
  "place_id": null,
  "zoom": null,
  "geo_coordinates": null,
  "country_code": null,
  "location": null
}

Query the results via an API request, or optionally have them sent to your webhook. More details can be found on the individual scraper pages.

GET https://data.unwrangle.com/api/jobs/30448/results/
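
A minimal Python sketch for fetching the results (the authentication method and the shape of the results payload are assumptions; see the individual scraper pages for specifics):

import requests

API_KEY = "YOUR_API_KEY"
JOB_ID = 30448  # the "id" returned when the job was created

response = requests.get(
    f"https://data.unwrangle.com/api/jobs/{JOB_ID}/results/",
    params={"api_key": API_KEY},  # assumption: same query-parameter auth as when creating the job
    timeout=60,
)
results = response.json()
print(results)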
