Scraping Target.com in 2026: Products, Pricing, and Circle Deals via the RedSky API
I've spent a fair amount of time working with Target's data for a price comparison tool I maintain. The API structure is well-organized once you figure it out, and data quality is solid. This post covers Target's backend, how to pull product data and pricing via the RedSky API, how Circle deals surface in the response, and what proxy setup has worked reliably for me.
Target's Anti-Bot Protections
Target uses a layered defense stack. The most significant piece is Shape Security (acquired by F5), a behavioral analytics platform that analyzes mouse movements, keystroke timing, browser fingerprints, and request patterns. It doesn't block on the first suspicious signal -- it typically lets you make a few requests, then quietly returns empty responses or 403s after making a determination.
Beyond Shape, Akamai handles rate limiting and IP reputation scoring. Akamai's bot manager hard-blocks known datacenter ranges almost immediately -- you'll get a clean 200 on your first request from a fresh datacenter IP, then nothing useful afterward.
The RedSky API doesn't require browser-level fingerprinting if you replicate request headers accurately, but requests need to look like they originate from Target's frontend, including the correct x-api-key and Visitor-ID headers. API keys rotate on no fixed schedule -- I've seen the same key work for weeks, then suddenly fail. The current key is embedded in the page JavaScript and easy to extract with a quick regex against the bundle.
Understanding RedSky API
RedSky is Target's internal API that the entire target.com frontend is built on. The base for product endpoints is:
https://redsky.target.com/redsky_aggregations/v1/
The best way to discover endpoints is browser DevTools -- load a product page, open Network, filter by redsky.target.com, and watch what fires. The two most useful endpoints are web/pdp_client_v1 for full product details and web/product_summary_with_fulfillment_v1 for pricing and availability. Both take a tcin parameter -- Target's internal product ID, visible in every product URL.
Finding the Current API Key
The API key is embedded in Target's JavaScript bundles. To extract the current one:
import requests
import re
def get_target_api_key():
"""Extract the current RedSky API key from Target's JS bundle."""
# First, find the main JS bundle URL
resp = requests.get(
"https://www.target.com/p/-/A-54191097",
headers={"User-Agent": "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36"}
)
# Look for the API key pattern in the page source
match = re.search(r'"apiKey"\s*:\s*"([a-f0-9]{40})"', resp.text)
if match:
return match.group(1)
    # Fallback: scan script tags for JS bundle URLs, then search those
    # (scene7 is Target's image CDN, so don't restrict the match to it)
    bundle_matches = re.findall(r'src="(https://[^"]+\.js)"', resp.text)
for bundle_url in bundle_matches[:3]:
bundle_resp = requests.get(bundle_url, timeout=10)
key_match = re.search(r'ff[0-9a-f]{38}', bundle_resp.text)
if key_match:
return key_match.group(0)
# Last resort: use the known key (may have rotated)
return "ff457966e64d5e877fdbad070f276d18ecec4a01"
API_KEY = get_target_api_key()
REDSKY_BASE = "https://redsky.target.com/redsky_aggregations/v1"
Fetching Product Data
import httpx
import json
from typing import Optional
def build_headers(tcin: str) -> dict:
"""Build realistic Target.com request headers."""
return {
"User-Agent": "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/124.0.0.0 Safari/537.36",
"Accept": "application/json",
"Accept-Language": "en-US,en;q=0.9",
"Referer": f"https://www.target.com/p/-/A-{tcin}",
"sec-ch-ua": '"Chromium";v="124", "Google Chrome";v="124", "Not-A.Brand";v="99"',
"sec-ch-ua-mobile": "?0",
"sec-ch-ua-platform": '"macOS"',
"sec-fetch-dest": "empty",
"sec-fetch-mode": "cors",
"sec-fetch-site": "same-site",
}
def get_product(tcin: str, store_id: str = "3991", zip_code: str = "10001", state: str = "NY") -> dict:
"""Fetch full product details from Target's RedSky API."""
params = {
"key": API_KEY,
"tcin": tcin,
"is_bot": "false",
"store_id": store_id,
"zip": zip_code,
"state": state,
"channel": "WEB",
"page": f"/p/A-{tcin}",
"pricing_store_id": store_id,
}
with httpx.Client(timeout=15) as client:
resp = client.get(
f"{REDSKY_BASE}/web/pdp_client_v1",
params=params,
headers=build_headers(tcin)
)
resp.raise_for_status()
return resp.json()
def parse_product(data: dict) -> dict:
"""Extract the key fields from a product API response."""
item = data["data"]["product"]["item"]
price = data["data"]["product"].get("price", {})
fulfillment = data["data"]["product"].get("fulfillment", {})
desc = item.get("product_description", {})
ent = item.get("enrichment", {})
return {
"tcin": item.get("tcin"),
"upc": item.get("primary_barcode"),
"title": desc.get("title"),
"brand": item.get("primary_brand", {}).get("name"),
"bullet_points": desc.get("bullet_descriptions", []),
"soft_bullets": desc.get("soft_bullets", {}).get("bullets", []),
"category_path": [n["name"] for n in item.get("product_classification", {}).get("merchandise_type_path", [])],
"dpci": item.get("dpci"),
"tcin_external": item.get("relationship_type"),
"price_current": price.get("current_retail"),
"price_regular": price.get("reg_retail"),
"price_sale": price.get("sale_retail"),
"is_on_sale": price.get("is_current_price_sale", False),
"image_primary": ent.get("images", {}).get("primary_image_url"),
"image_alt_count": len(ent.get("images", {}).get("alternate_image_urls", [])),
"avg_rating": item.get("ratings_and_reviews", {}).get("statistics", {}).get("rating", {}).get("average"),
"review_count": item.get("ratings_and_reviews", {}).get("statistics", {}).get("rating", {}).get("count"),
"in_stock": fulfillment.get("shipping_options", {}).get("availability_status") == "IN_STOCK",
}
# Example usage
data = get_product("54191097")
product = parse_product(data)
print(f"{product['title']}")
print(f"Brand: {product['brand']}")
print(f"Price: ${product['price_current']}")
print(f"Rating: {product['avg_rating']} ({product['review_count']} reviews)")
The response includes title, description, brand, images, bullet points, DPCI, and category hierarchy -- usually 15-30KB per product.
Real-Time Pricing and Store Availability
def get_price_and_availability(tcin: str, store_ids: list[str], zip_code: str = "10001", state: str = "NY") -> dict:
"""Get pricing and store availability for a product."""
params = {
"key": API_KEY,
"tcins": tcin,
"store_ids": ",".join(store_ids),
"zip": zip_code,
"state": state,
"channel": "WEB",
"pricing_store_id": store_ids[0],
"has_store_positions_flap": "true",
"scheduled_delivery_store_id": store_ids[0],
}
with httpx.Client(timeout=15) as client:
resp = client.get(
f"{REDSKY_BASE}/web/product_summary_with_fulfillment_v1",
params=params,
headers=build_headers(tcin)
)
resp.raise_for_status()
return resp.json()
def parse_availability(data: dict) -> list[dict]:
"""Parse store availability from the fulfillment response."""
results = []
for product in data.get("data", {}).get("product_summaries", []):
store_options = product.get("fulfillment", {}).get("store_options", [])
for store in store_options:
in_store = store.get("in_store_only", {})
results.append({
"store_id": store.get("store_id"),
"store_name": store.get("location_name"),
"availability": in_store.get("availability_status"), # IN_STOCK, OUT_OF_STOCK, LIMITED_STOCK
"quantity": in_store.get("available_to_promise_quantity"),
"aisle_location": store.get("fulfillment_purchase_options", {}).get("in_store", {}).get("location", ""),
})
return results
result = get_price_and_availability("54191097", ["3991", "1357", "2924"])
for store in parse_availability(result):
status = store["availability"]
icon = "✓" if status == "IN_STOCK" else ("!" if status == "LIMITED_STOCK" else "✗")
print(f" {icon} Store {store['store_id']} ({store['store_name']}): {status}")
if store.get("aisle_location"):
print(f" Location: {store['aisle_location']}")
availability_status returns IN_STOCK, OUT_OF_STOCK, or LIMITED_STOCK.
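Note that the query parameter is `tcins` (plural) -- the endpoint appears to accept a comma-separated list, so several products can be checked in one call. A hedged sketch of a params builder for that, reusing the field names from the function above (`batch_summary_params` is my own helper name, not a Target API, and I haven't verified an upper bound on batch size):

```python
def batch_summary_params(tcins: list[str], store_ids: list[str],
                         zip_code: str = "10001", state: str = "NY",
                         api_key: str = "YOUR_KEY") -> dict:
    """Build query params for product_summary_with_fulfillment_v1
    covering multiple TCINs in a single request."""
    return {
        "key": api_key,
        "tcins": ",".join(tcins),          # comma-separated batch
        "store_ids": ",".join(store_ids),
        "zip": zip_code,
        "state": state,
        "channel": "WEB",
        "pricing_store_id": store_ids[0],
    }

params = batch_summary_params(["54191097", "83768703"], ["3991", "1357"])
print(params["tcins"])  # 54191097,83768703
```

Batching cuts your request count, which matters when every request burns proxy bandwidth and rate-limit budget.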
Searching for Products
Target's search API returns products matching a query:
def search_products(query: str, store_id: str = "3991", page: int = 1, per_page: int = 24) -> dict:
"""Search Target products by keyword."""
params = {
"key": API_KEY,
"channel": "WEB",
"count": per_page,
"default_purchasability_filter": "true",
"hasOnlyGoodForMeFilter": "false",
"include_sponsored": "true",
"keyword": query,
"offset": (page - 1) * per_page,
"page": f"/s/{query.replace(' ', '+')}",
"platform": "desktop",
"pricing_store_id": store_id,
"spellcheck": "true",
"store_ids": store_id,
"useragent": "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7)",
"visitor_id": "0192abc123",
}
headers = {
"User-Agent": "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36",
"Referer": f"https://www.target.com/s?searchTerm={query}",
"Accept": "application/json",
}
with httpx.Client(timeout=15) as client:
resp = client.get(
"https://redsky.target.com/redsky_aggregations/v1/web/plp_search_v2",
params=params,
headers=headers
)
resp.raise_for_status()
return resp.json()
def parse_search_results(data: dict) -> list[dict]:
"""Extract product summaries from search results."""
products = []
items = data.get("data", {}).get("search", {}).get("products", [])
for item in items:
products.append({
"tcin": item.get("tcin"),
"title": item.get("item", {}).get("product_description", {}).get("title"),
"brand": item.get("item", {}).get("primary_brand", {}).get("name"),
"price": item.get("price", {}).get("current_retail"),
"price_regular": item.get("price", {}).get("reg_retail"),
"is_sale": item.get("price", {}).get("is_current_price_sale", False),
"avg_rating": item.get("ratings_and_reviews", {}).get("statistics", {}).get("rating", {}).get("average"),
"review_count": item.get("ratings_and_reviews", {}).get("statistics", {}).get("rating", {}).get("count"),
"fulfillment": item.get("fulfillment", {}).get("shipping_options", {}).get("availability_status"),
})
return products
results = parse_search_results(search_products("coffee maker"))
print(f"Found {len(results)} coffee makers on this page")
for p in results[:5]:
sale_marker = " [SALE]" if p["is_sale"] else ""
print(f" TCIN {p['tcin']}: {p['title'][:60]} -- ${p['price']}{sale_marker}")
Circle Deals and Promotions
Circle promotions are embedded in the product detail response inside a promotions array:
def extract_circle_deals(product_data: dict) -> list[dict]:
"""Extract Target Circle promotions from product data."""
deals = []
promotions = (
product_data.get("data", {})
.get("product", {})
.get("price", {})
.get("formatted_promotions", [])
)
for promo in promotions:
if promo.get("promotion_class") == "CIRCLE":
deals.append({
"label": promo.get("promotion_display_override", ""),
"save_amount": promo.get("saved_amount", {}).get("amount"),
"percent_off": promo.get("percent_off"),
"end_date": promo.get("promotion_end_date"),
"promo_id": promo.get("promotion_id"),
"type": promo.get("type"),
})
elif promo.get("promotion_class") in ("DOLLAR_OFF", "PERCENT_OFF", "CLEARANCE"):
deals.append({
"label": promo.get("free_shipping_threshold", promo.get("promotion_display_override", "")),
"class": promo.get("promotion_class"),
"save_amount": promo.get("saved_amount", {}).get("amount"),
"end_date": promo.get("promotion_end_date"),
})
return deals
data = get_product("54191097")
deals = extract_circle_deals(data)
if deals:
for deal in deals:
        print(f"  Deal: {deal['label']} (saves ${deal.get('save_amount') or 'N/A'}, ends {deal.get('end_date') or 'N/A'})")
For categories like household essentials and personal care, you'll find active Circle promotions on a significant portion of items.
Proxy Strategy for Target
Datacenter IPs get flagged fast -- AWS, GCP, and DigitalOcean ranges are typically blocked on the first or second request. Shape's behavioral model also builds a session-level risk score, so even a residential IP making too many requests in a short window will eventually get flagged.
The setup that's worked best is rotating residential proxies with geo-targeting, spreading TCIN lookups across IPs in realistic geographic regions. For this, ThorData's residential proxy network supports state-level targeting and both per-request and session-based rotation. The geo-targeting matters because Target's pricing and availability are region-specific.
import httpx
import random
PROXY_HOST = "residential.thordata.net"
PROXY_PORT = 10000
PROXY_USER = "your_username"
PROXY_PASS = "your_password"
def get_product_proxied(tcin: str, state: str = "NY", zip_code: str = "10001") -> dict:
"""Fetch product data via ThorData proxy with state-level geo-targeting."""
    # A session ID pins the exit IP; generating a fresh random ID here
    # rotates per call -- reuse one ID across a burst for IP consistency
session_id = random.randint(10000, 99999)
proxy_user = f"{PROXY_USER}-state-{state.lower()}-session-{session_id}"
proxy_url = f"http://{proxy_user}:{PROXY_PASS}@{PROXY_HOST}:{PROXY_PORT}"
params = {
"key": API_KEY,
"tcin": tcin,
"channel": "WEB",
"page": f"/p/A-{tcin}",
"state": state,
"zip": zip_code,
"pricing_store_id": "3991",
}
with httpx.Client(proxy=proxy_url, timeout=25) as client:
resp = client.get(
f"{REDSKY_BASE}/web/pdp_client_v1",
params=params,
headers=build_headers(tcin)
)
resp.raise_for_status()
return resp.json()
One note: session-based rotation (holding the same IP for a short burst of requests) works better than per-request rotation. Shape is more suspicious of IPs that appear on exactly one request than ones that look like a real browsing session -- a session ID in the proxy username achieves this.
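A minimal sketch of that session-burst pattern: hold one session ID (and therefore one exit IP) for a handful of requests, then rotate. The username format mirrors the ThorData example above; the default burst size of 5 is an arbitrary choice of mine, not a tested threshold:

```python
import random

class ProxySessionPool:
    """Rotate proxy sessions every `burst` requests instead of per request."""

    def __init__(self, user: str, password: str, host: str, port: int,
                 state: str = "ny", burst: int = 5):
        self.user, self.password = user, password
        self.host, self.port = host, port
        self.state = state
        self.burst = burst
        self._count = 0
        self._session_id = random.randint(10000, 99999)

    def proxy_url(self) -> str:
        """Return a proxy URL, rotating the session ID every `burst` calls."""
        if self._count and self._count % self.burst == 0:
            self._session_id = random.randint(10000, 99999)
        self._count += 1
        user = f"{self.user}-state-{self.state}-session-{self._session_id}"
        return f"http://{user}:{self.password}@{self.host}:{self.port}"

pool = ProxySessionPool("your_username", "your_password",
                        "residential.thordata.net", 10000)
urls = [pool.proxy_url() for _ in range(10)]  # same IP for 5 calls, then rotates
```

Pass the returned URL as the `proxy` argument to `httpx.Client`, as in `get_product_proxied` above.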
Building a Deal Tracker
import sqlite3
import json
import time
from datetime import datetime
def init_db():
conn = sqlite3.connect("target_deals.db")
conn.executescript("""
CREATE TABLE IF NOT EXISTS products (
tcin TEXT PRIMARY KEY,
title TEXT,
brand TEXT,
category TEXT,
dpci TEXT
);
CREATE TABLE IF NOT EXISTS price_history (
tcin TEXT,
checked_at TEXT,
price REAL,
price_regular REAL,
is_on_sale INTEGER,
circle_deals TEXT,
availability TEXT,
PRIMARY KEY (tcin, checked_at)
);
CREATE INDEX IF NOT EXISTS idx_prices_tcin ON price_history(tcin);
CREATE INDEX IF NOT EXISTS idx_prices_date ON price_history(checked_at);
""")
conn.commit()
return conn
def check_and_store(tcin: str, conn: sqlite3.Connection):
"""Fetch current data for a TCIN and store in DB, alerting on price drops."""
try:
data = get_product_proxied(tcin)
item = data["data"]["product"]["item"]
price_data = data["data"]["product"].get("price", {})
fulfillment = data["data"]["product"].get("fulfillment", {})
current_price = price_data.get("current_retail")
regular_price = price_data.get("reg_retail")
is_sale = price_data.get("is_current_price_sale", False)
deals = extract_circle_deals(data)
avail = fulfillment.get("shipping_options", {}).get("availability_status")
now = datetime.utcnow().isoformat()
# Update product table
conn.execute(
"INSERT OR REPLACE INTO products (tcin, title, brand, dpci) VALUES (?,?,?,?)",
(
tcin,
item.get("product_description", {}).get("title"),
item.get("primary_brand", {}).get("name"),
item.get("dpci"),
)
)
# Store price snapshot
conn.execute(
"INSERT OR REPLACE INTO price_history VALUES (?,?,?,?,?,?,?)",
(tcin, now, current_price, regular_price, int(is_sale), json.dumps(deals), avail)
)
conn.commit()
# Check for price drop
prev = conn.execute(
"""SELECT price FROM price_history
WHERE tcin=? AND checked_at < ?
ORDER BY checked_at DESC LIMIT 1""",
(tcin, now)
).fetchone()
if prev and current_price and prev[0] and current_price < prev[0] * 0.95: # >5% drop
title = conn.execute("SELECT title FROM products WHERE tcin=?", (tcin,)).fetchone()
print(f"PRICE DROP: {title[0] if title else tcin}")
print(f" ${prev[0]:.2f} -> ${current_price:.2f} ({((current_price/prev[0])-1)*100:+.1f}%)")
        if deals:
            print(f"ACTIVE DEAL on {tcin}: {deals[0]['label']}")
except Exception as e:
print(f"Error on {tcin}: {e}")
conn = init_db()
tcins_to_monitor = ["54191097", "83768703", "15043875", "77153696", "12345678"]
for tcin in tcins_to_monitor:
check_and_store(tcin, conn)
time.sleep(random.uniform(2, 5))
conn.close()
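Once snapshots accumulate, the history table answers useful questions directly in SQL. A sketch against the `price_history` schema above -- lowest observed price, latest price, and sample count per TCIN, demonstrated here on an in-memory DB with dummy rows:

```python
import sqlite3

def price_summary(conn: sqlite3.Connection) -> list[tuple]:
    """Per-TCIN: lowest observed price, latest price, and sample count."""
    return conn.execute("""
        SELECT tcin,
               MIN(price) AS lowest_price,
               (SELECT price FROM price_history p2
                WHERE p2.tcin = p1.tcin
                ORDER BY checked_at DESC LIMIT 1) AS latest_price,
               COUNT(*) AS samples
        FROM price_history p1
        GROUP BY tcin
    """).fetchall()

# Quick demo on an in-memory DB using the same schema
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE price_history (
    tcin TEXT, checked_at TEXT, price REAL, price_regular REAL,
    is_on_sale INTEGER, circle_deals TEXT, availability TEXT,
    PRIMARY KEY (tcin, checked_at))""")
for ts, price in [("2026-01-01T00:00:00", 29.99),
                  ("2026-01-02T00:00:00", 24.99),
                  ("2026-01-03T00:00:00", 27.99)]:
    conn.execute("INSERT INTO price_history VALUES (?,?,?,?,?,?,?)",
                 ("54191097", ts, price, 29.99, 0, "[]", "IN_STOCK"))
print(price_summary(conn))  # [('54191097', 24.99, 27.99, 3)]
```

A "latest price equals lowest observed price" condition from this query is a decent trigger for deal alerts.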
Refreshing the API Key
When the API key rotates, all your requests start returning 401 or empty data. Here's how to automatically detect and refresh it:
import re
import requests
def refresh_api_key() -> str:
"""Extract the latest RedSky API key from Target's frontend JS."""
# Load a product page to trigger JS bundle loading
resp = requests.get(
"https://www.target.com/c/electronics/-/N-5xt1a",
headers={
"User-Agent": "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36",
"Accept": "text/html,application/xhtml+xml",
},
timeout=20
)
# Pattern matches 40-char hex strings (the API key format)
matches = re.findall(r'(?:apiKey|api_key)["\s:=]+([a-f0-9]{40})', resp.text)
if matches:
return matches[0]
# Try searching the main JS bundle
bundle_match = re.search(r'"(/assets/js/client\.[a-f0-9]+\.js)"', resp.text)
if bundle_match:
bundle_resp = requests.get(
f"https://www.target.com{bundle_match.group(1)}",
timeout=20
)
key_match = re.search(r'[a-f0-9]{40}', bundle_resp.text)
if key_match:
return key_match.group(0)
return None
# Use in your scraper:
def make_request_with_key_refresh(tcin: str, retries: int = 2):
global API_KEY
for attempt in range(retries):
try:
data = get_product(tcin)
return data
except httpx.HTTPStatusError as e:
if e.response.status_code in (401, 403) and attempt < retries - 1:
print("API key may have rotated, refreshing...")
new_key = refresh_api_key()
if new_key:
API_KEY = new_key
print(f"Updated API key: {new_key[:8]}...")
continue
raise
Conclusion
The RedSky API is stable enough to build real tools on, though API key rotation and Shape Security mean you can't set it and forget it. Re-extract the key from the live JavaScript bundle whenever requests start returning 401s, or on a weekly schedule as insurance. With a decent residential proxy pool via ThorData and sensible request pacing, you can run this scraper reliably for months.
The Circle deal extraction alone makes this worth building if you're in the deal-hunting or cashback space. Combined with price history tracking, you get a solid early-warning system for Target's promotional cycles -- useful both for personal savings and for building commercial deal-alert products.
If you hit unexplained 403s, check whether the API key has rotated first, then your IP reputation. Those two issues account for the vast majority of failures.