TL;DR
The Apify Google Search Scraper extracts organic results, paid ads, People Also Ask, AI overviews, and related queries from any Google search. Free plan gives you $5/month in credits — enough for ~500 SERP queries. No coding needed. Setup takes under 5 minutes.
Why Scrape Google Search Results?
Google is the most valuable public dataset on the planet. The SERP is where your market tells you exactly what it wants, who is winning, and what's missing. Every page-one ranking is a signal — someone figured out what buyers type into the search bar and built the best answer.
Scraping Google search results powers:
- SEO rank tracking: Monitor your domain's position for hundreds of keywords, daily, without paying $300/month for rank trackers
- Competitor research: See exactly which competitors rank for your target keywords and what their snippets say
- Content gap analysis: Find high-intent queries where no strong content exists — your editorial opportunity
- Ad intelligence: Capture which brands run ads on your keywords, what their copy says, and what landing pages they use
- People Also Ask mining: Extract the PAA box for any query to build FAQ content and discover question-based keywords
- AI overview monitoring: Track what Google's AI overviews say about your brand, products, or competitors as Google's AI Mode rolls out
- Lead generation: Build prospect lists from organic result URLs in specific industries or locations
- Market research: Understand what results dominate specific niches — product types, content formats, publishers
Google offers no official API for SERP data. The closest thing, the Custom Search JSON API, returns 10 results per query at $5 per 1,000 queries, strips out ads, and excludes PAA boxes — it's not suitable for serious SEO or research work. Web scraping fills the gap.
Which Google Scraper Should You Use?
Apify hosts several Google-related scrapers. For standard web search results, these are the main options:
| Scraper | Best For | Users | Pricing |
|---|---|---|---|
| Google Search Scraper | Full SERP data: organic, ads, PAA, AI overviews | 500,000+ | Pay-per-event |
| Google Search Results (SERP) Scraper | Fast, lightweight: position, title, URL, snippet | 10,000+ | $0.001/result |
| Google Search Scraper (epctex) | Alternative with slightly different output structure | 8,000+ | $0.002/result |
| Google Trends Scraper | Search volume trend data over time | 15,000+ | Pay-per-event |
Bottom line: Use the main Apify Google Search Scraper for any serious SERP work. It's the most feature-complete option — built by Apify themselves — and handles edge cases like CAPTCHA rotation and proxy management automatically. The lighter SERP scraper works if you only need basic position + URL data and want lower cost.
What Data You Can Extract
The Google Search Scraper captures every data type visible on the SERP:
Organic Results
- Title text
- URL (full canonical)
- Meta description snippet
- Position (1–100+)
- Breadcrumb path
- Sitelinks (when present)
- Review stars and count (rich results)
- Product price (shopping rich results)
Paid Results
- Ad title and description
- Display URL
- Landing page URL
- Ad position (top/bottom)
- Ad extensions shown
- Shopping ads with price and brand
SERP Features
- People Also Ask questions + answers
- Related queries
- Featured snippet content
- AI overview text (AI Mode add-on)
- Local pack results (when triggered)
- Knowledge panel data
Query Metadata
- Search query string
- Country and language code
- Device type (desktop/mobile)
- Google domain used
- Result count estimate
- Pagination page number
Step-by-Step: Scraping Google Search Results
Before You Start
You need one thing: a free Apify account. Signup takes 30 seconds. No credit card required. You get $5 in free credits each month — enough for hundreds of SERP queries.
Step 1: Open the Actor
Go to the Google Search Scraper on Apify Store and click "Try for free." This opens the actor console.
The console has two sections: the input form on the left and live output on the right. You only need to fill in a few fields.
Step 2: Configure Your Queries
In the Search queries field, enter your keywords — one per line. You can add as many as you need in a single run:
Example input:

```
best project management software
project management tools for small teams
asana vs monday.com
free project management software 2026
```
Key configuration settings:
- Country code: Two-letter ISO code (US, GB, DE, AU). Determines which Google index is queried. Default: US.
- Language code: Restricts results to a specific language (en, de, es, fr). Useful for multilingual SEO.
- Results per page: Google returns 10 results by default. Set to 100 (Google's maximum) to capture up to 100 results per query in a single request instead of paginating.
- Maximum pages per query: Set to 1 for position data, 3–5 for deeper research into what ranks beyond page one.
- Device type: DESKTOP or MOBILE. Mobile SERPs differ significantly in features and rankings — run both if you're doing a mobile SEO audit.
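Assembled, the settings above correspond to an input JSON like the following. This is a sketch based on the fields described in this article — verify exact field names against the actor's input schema in its README:

```json
{
  "queries": "best project management software\nproject management tools for small teams",
  "countryCode": "US",
  "languageCode": "en",
  "resultsPerPage": 100,
  "maxPagesPerQuery": 1
}
```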
Step 3: Set Proxy Configuration
Leave proxy settings on the default "Automatic." The actor manages proxy rotation for you using Apify's residential proxy network. Google aggressively blocks datacenter IPs — the actor handles this without any configuration on your end.
If you're scraping a specific country's results, set the Proxy country to match your target country code. This ensures your requests come from IPs in that country, giving you localized results.
Step 4: Run and Download
Click Start. The actor starts immediately. For 10 queries, you'll typically have results in under 60 seconds. Watch the progress in the log panel.
When the run completes, click the Results tab. You'll see a structured dataset. Export options:
- JSON: Best for API integrations and programmatic processing
- CSV: Best for Google Sheets and Excel analysis
- Excel: Pre-formatted spreadsheet ready for sharing
- JSONL: Line-delimited JSON for streaming large datasets
Real JSON Output Example
Here's exactly what a single query result looks like in JSON:
```json
{
  "searchQuery": {
    "term": "best project management software",
    "device": "DESKTOP",
    "page": 1,
    "type": "SEARCH",
    "domain": "google.com",
    "countryCode": "US",
    "languageCode": "en"
  },
  "organicResults": [
    {
      "title": "10 Best Project Management Software for 2026",
      "url": "https://www.g2.com/categories/project-management",
      "description": "Compare the best project management software tools. Read verified reviews, pricing, and features from real users.",
      "position": 1,
      "type": "organic"
    },
    {
      "title": "Asana | Work Management Platform",
      "url": "https://asana.com/",
      "description": "Work anytime, anywhere with Asana. Keep remote and distributed teams, and your entire organization, focused on the goals, projects, and daily tasks.",
      "position": 2,
      "type": "organic",
      "siteLinks": [
        { "title": "Pricing", "url": "https://asana.com/pricing" },
        { "title": "Features", "url": "https://asana.com/features" }
      ]
    }
  ],
  "paidResults": [
    {
      "title": "Monday.com® Official Site - Best Work Management Platform",
      "url": "https://monday.com/",
      "displayUrl": "monday.com",
      "description": "Plan, track, and manage any type of work. Highly customizable & easy to use.",
      "type": "paid"
    }
  ],
  "peopleAlsoAsk": [
    {
      "question": "Which project management software is best for small teams?",
      "answer": "For small teams, tools like Trello, Asana (free tier), and ClickUp offer generous free plans..."
    },
    {
      "question": "Is Asana better than Monday?",
      "answer": "Asana and Monday are both popular choices, but they serve different needs..."
    }
  ],
  "relatedQueries": [
    { "title": "free project management software" },
    { "title": "best project management apps" },
    { "title": "project management tools for developers" }
  ],
  "resultsTotal": 4820000000
}
```
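To turn this nested JSON into flat spreadsheet rows, walk the organicResults of each dataset item. A minimal sketch using only field names shown in the example output above:

```python
import csv
import io

def flatten_organic(items):
    """Flatten scraper output into one row per organic result."""
    rows = []
    for item in items:
        term = item["searchQuery"]["term"]
        for res in item.get("organicResults", []):
            rows.append({
                "query": term,
                "position": res["position"],
                "title": res["title"],
                "url": res["url"],
            })
    return rows

def to_csv(rows):
    """Serialize flattened rows to a CSV string for Sheets or Excel."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=["query", "position", "title", "url"])
    writer.writeheader()
    writer.writerows(rows)
    return buf.getvalue()
```

The same pattern extends to paidResults or peopleAlsoAsk — add a second loop and more columns as needed.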
Cost Breakdown: What Does It Actually Cost?
Apify pricing for the Google Search Scraper uses a pay-per-event model. Costs scale with volume:
| Queries | Approx. Cost | What You Can Do |
|---|---|---|
| 50 queries | ~$0.50 | Quick competitive audit for a single niche |
| 200 queries | ~$2.00 | Full keyword set for one topic cluster |
| 500 queries/day | ~$5.00/day | Daily rank tracking for 500 keywords |
| 5,000 queries/day | ~$30–40/day | Enterprise-scale rank tracking or market research |
| Free plan | $0 (up to $5/month) | ~200–500 queries, enough to build an initial dataset |
Compare that to alternatives:
- Semrush Position Tracking: $119–$449/month for up to 1,000–5,000 keywords. You get a UI but limited flexibility.
- Ahrefs Rank Tracker: From $99/month for 250 keywords tracked weekly.
- Google Custom Search API: $5 per 1,000 queries for just 10 results each — comparable cost per query, but no ads, no PAA, and no SERP features.
- Apify: Pay only for what you run. No monthly minimum. Build exactly the pipeline you need.
For a side project or small business tracking 100–200 keywords daily, Apify's cost runs under $15/month — a fraction of any dedicated rank tracking tool.
Use Case 1: Daily Rank Tracking (Automated)
The most common use case: track your site's Google rankings for a set of target keywords, daily, without paying SaaS prices.
Setup
Paste your keyword list into the actor. Set Maximum pages per query to 3 (captures positions 1–30). Then click Schedule in the actor console and set it to run daily at 6 AM.
Each run creates a new dataset. Use the Apify API or a Zapier integration to push results to Google Sheets automatically.
Parsing Position Data
In your spreadsheet, filter organicResults where url contains your domain. The position field gives your current rank. Store results by date. After 30 days, you have a rank trend chart for every keyword.
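If you'd rather do this step in code than in a spreadsheet, the filter is a few lines of Python. A sketch assuming dataset items follow the JSON shape shown earlier:

```python
from datetime import date

def extract_ranks(items, domain):
    """Return {query: best position} for organic results whose URL contains `domain`."""
    ranks = {}
    for item in items:
        term = item["searchQuery"]["term"]
        for res in item.get("organicResults", []):
            if domain in res["url"]:
                # Keep the best (lowest) position if the domain ranks more than once
                if term not in ranks or res["position"] < ranks[term]:
                    ranks[term] = res["position"]
    return ranks

def append_history(history, ranks):
    """Store today's ranks under an ISO date key to build a trend over time."""
    history[date.today().isoformat()] = ranks
    return history
```

Run this after each scheduled scrape and persist `history` to a sheet or database; after 30 days you have the same rank trend data described above, per keyword.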
Use Case 2: Competitive SERP Analysis
You want to understand the competitive landscape for 50 keywords in your niche. Who consistently ranks? What content formats win? Are there featured snippets you could capture?
Step 1: Build Your Keyword List
Export your target keyword set from your keyword tool. For this use case, aim for 50–200 keywords across your topic cluster.
Step 2: Run the Scraper at Depth
Set Maximum pages per query to 2 (captures top 20 results). Run for all keywords in one batch.
Step 3: Analyze the Data
In Excel or Python, count how many times each domain appears in the top 10 across all keywords. Domains that appear in 30%+ of SERPs are your primary competitors. Look at their URL structures and title patterns for the high-ranking pages — this reveals what content formats Google favors in your niche.
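The Python version of that domain tally is a `Counter` over top-10 results. A sketch, again assuming the dataset shape shown in the JSON example:

```python
from collections import Counter
from urllib.parse import urlparse

def top10_domain_counts(items):
    """Count how often each domain appears in the top 10 across all scraped SERPs."""
    counts = Counter()
    for item in items:
        for res in item.get("organicResults", []):
            if res["position"] <= 10:
                counts[urlparse(res["url"]).netloc] += 1
    return counts

def primary_competitors(items, threshold=0.3):
    """Domains present in at least `threshold` (default 30%) of all scraped SERPs."""
    counts = top10_domain_counts(items)
    n_serps = len(items)
    return [d for d, c in counts.items() if n_serps and c / n_serps >= threshold]
```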
Use Case 3: People Also Ask Mining for Content Ideas
People Also Ask boxes are a direct window into what your audience wants to know. Mining PAA systematically turns one keyword into dozens of long-tail content ideas.
How It Works
Run the Google Search Scraper on your 10–20 primary keywords. Export as JSON. Extract all peopleAlsoAsk entries across all queries. You'll typically get 4–8 questions per SERP — from 20 keywords, that's 80–160 questions your audience is actively asking.
Deduplicate and group by theme. Each cluster is a potential FAQ section, blog post, or YouTube video. The questions with the longest answers in PAA boxes tend to have the highest search volume and are most likely to generate featured snippets for your site.
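The dedupe-and-group step can be automated with a few lines of Python. The keyword grouping here is deliberately naive (first keyword mentioned wins) — treat it as a starting point, not a full clustering method:

```python
def collect_paa(items):
    """Gather unique People Also Ask questions across all scraped queries."""
    seen = set()
    questions = []
    for item in items:
        for paa in item.get("peopleAlsoAsk", []):
            q = paa["question"].strip()
            key = q.lower().rstrip("?")  # case- and punctuation-insensitive dedupe
            if key not in seen:
                seen.add(key)
                questions.append(q)
    return questions

def group_by_keyword(questions, keywords):
    """Bucket each question under the first keyword it mentions; rest go to 'other'."""
    groups = {k: [] for k in keywords}
    groups["other"] = []
    for q in questions:
        for k in keywords:
            if k.lower() in q.lower():
                groups[k].append(q)
                break
        else:
            groups["other"].append(q)
    return groups
```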
Use Case 4: Ad Intelligence — What Are Competitors Running?
Paid results data reveals your competitors' Google Ads strategy without spending a dollar on ads yourself.
Run the scraper on your highest-value commercial keywords — queries like "best [product type]," "[product] pricing," "[product] vs [competitor]." Extract all paidResults entries.
For each ad, note:
- Headline patterns: Are they leading with price, features, social proof, or urgency?
- Landing page URLs: Do they send ad traffic to the homepage, a dedicated landing page, or a category page?
- Ad extensions: Are they using sitelinks, callouts, or review extensions?
- Ad presence: Which keywords have heavy ad competition (many ads) vs. low competition (no ads)?
Keywords with no ads but strong organic competition are often underserved paid opportunities. Keywords with 4+ ads at the top tell you where commercial intent is highest.
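Sorting keywords into high- and zero-ad-competition buckets is a one-liner per bucket. A sketch over the `paidResults` field from the JSON example:

```python
def ad_competition(items):
    """Map each query to the number of paid results on its SERP."""
    return {item["searchQuery"]["term"]: len(item.get("paidResults", []))
            for item in items}

def paid_opportunities(items):
    """Queries with no ads at all: candidate underserved paid keywords."""
    return [q for q, n in ad_competition(items).items() if n == 0]
```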
Python Code Example: Automated SERP Analysis
Here's how to trigger a Google SERP scrape via the Apify API and process results in Python:
```python
import time

import requests

API_TOKEN = "your_apify_token"
ACTOR_ID = "apify~google-search-scraper"

# Start the actor run (this actor takes queries as a newline-separated string)
run_input = {
    "queries": "\n".join([
        "best project management software",
        "asana alternatives",
        "trello vs notion",
    ]),
    "maxPagesPerQuery": 1,
    "resultsPerPage": 10,
    "countryCode": "US",
    "languageCode": "en",
}

run_response = requests.post(
    f"https://api.apify.com/v2/acts/{ACTOR_ID}/runs",
    params={"token": API_TOKEN},
    json=run_input,  # the request body is the actor input itself
)
run_id = run_response.json()["data"]["id"]

# Poll until the run reaches a terminal state
while True:
    status = requests.get(
        f"https://api.apify.com/v2/actor-runs/{run_id}",
        params={"token": API_TOKEN},
    ).json()["data"]["status"]
    if status in ("SUCCEEDED", "FAILED", "ABORTED", "TIMED-OUT"):
        break
    time.sleep(5)

# Fetch results from the run's default dataset
dataset_id = run_response.json()["data"]["defaultDatasetId"]
results = requests.get(
    f"https://api.apify.com/v2/datasets/{dataset_id}/items",
    params={"token": API_TOKEN, "format": "json"},
).json()

# Extract rank data for your domain
MY_DOMAIN = "yoursite.com"
for query_result in results:
    term = query_result["searchQuery"]["term"]
    for result in query_result.get("organicResults", []):
        if MY_DOMAIN in result["url"]:
            print(f"Query: {term} | Position: {result['position']} | URL: {result['url']}")
```
This script takes under 2 minutes to run for a batch of 50 keywords and costs roughly $0.50 in Apify credits.
Handling Pagination: Getting Beyond Page One
Most rank tracking only needs page one. But for content gap analysis and market research, you want pages 2–5.
Set maxPagesPerQuery to the number of pages you want. Each additional page adds approximately 10 organic results, and cost scales roughly linearly with page count. For most research tasks, 3 pages (top 30 results) is the sweet spot between coverage and cost.
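As a rough planning aid, the coverage/cost trade-off can be modeled in a couple of lines. The $0.01-per-page figure below is an estimate extrapolated from the pricing table earlier in this article, not a quoted rate:

```python
def estimate(queries, pages_per_query, cost_per_page=0.01):
    """Rough result-count and cost estimate: ~10 organic results per page."""
    results = queries * pages_per_query * 10
    cost = queries * pages_per_query * cost_per_page
    return results, round(cost, 2)
```

For example, 200 queries scraped 3 pages deep yields roughly 6,000 organic results for about $6.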
Important: Google's results beyond page 3 get noisy. Brand and directory sites fill up positions 21–100. For actionable competitive intelligence, focus on the top 20.
Legal Considerations
Scraping Google's public search results sits in a relatively well-established legal zone. The hiQ v. LinkedIn ruling (2022) affirmed that accessing publicly available web data is not a violation of the Computer Fraud and Abuse Act. That said, Google's robots.txt disallows automated crawling of /search and its Terms of Service discourage automated querying — SERP scraping is widespread industry practice rather than something Google explicitly permits, so stay within reasonable rate limits.
Practical guidelines:
- Don't scrape at abusive rates. Thousands of queries per second is unnecessary and gets blocked. Apify's actor paces requests automatically.
- Use the data for your own analysis. Building a competing search engine or reselling raw SERP data violates Google's Terms of Service.
- Don't circumvent security measures. Apify uses residential proxies — these mimic normal user traffic and respect Google's anti-abuse limits.
For standard business use cases — SEO tracking, competitive research, content strategy — you're well within accepted norms. Thousands of companies do this daily.
Frequently Asked Questions
Does the Google Search Scraper handle CAPTCHAs?
Yes. The actor uses Apify's residential proxy network and automatically rotates IPs, user agents, and request timing. CAPTCHAs appear when Google detects bot patterns — the actor is designed to avoid triggering those patterns in the first place. If a CAPTCHA does appear on a run, the actor retries with a different proxy automatically.
Can I scrape Google results for a specific city or region?
Yes. The actor supports UULE parameters — Google's encoded-location format — which pin results to a specific geographic location. You can target results for "Chicago" or even a specific zip code. This is essential for local SEO — national results look completely different from what someone in Austin actually sees. Generate UULE codes using any online UULE encoder, then paste them into the actor's location field.
How do I get AI Overview (SGE) data?
The actor has an optional AI Mode add-on that captures Google's AI-generated overviews. Enable it in the advanced settings. Note: AI overviews don't appear for every query — they're most common for informational queries. The add-on also supports scraping Perplexity and ChatGPT search results in the same run for cross-platform comparison.
What's the difference between the free plan and paid usage?
All features are available on the free plan. The $5/month credit covers a meaningful volume of queries. There's no feature gate — you pay based on usage, not plan tier.
Can I schedule Google SERP scrapes to run automatically?
Yes. In the Apify console, click Schedule on any actor. You can set it to run daily, weekly, or on a cron expression. Each scheduled run saves results to a new dataset automatically. Combine with the Apify API or Zapier to push fresh results to a Google Sheet or database after each run.
Getting Started
Rank tracking pays for itself quickly. One keyword moving from position 8 to position 3 can double organic traffic on that query. Knowing exactly where you stand — and where your competitors are — turns SEO from guesswork into a managed process.
- Sign up free: Create your Apify account — no credit card, $5 in free monthly credits
- Run your first batch: Paste 20 of your target keywords and run a test scrape
- Explore the data: Download as CSV, open in Google Sheets, look at who's ranking where
- Schedule it: Set up a daily run to start building your rank history database
- Scale from there: Add more keywords, add competitor tracking, add PAA mining
Browse all SEO and SERP scrapers on Apify, read the getting started guide, or jump straight to the Google Search Scraper.