Pages
The Pages endpoint returns metrics broken down by individual page path. See which pages AI bots crawl most, which get cited and clicked by LLMs, and which drive conversions. Useful for identifying your best AI-performing content.
GET /api/v1/sites/{siteId}/pages

Get page-level metrics for a site
Path Parameters
| Name | Type | Description |
|---|---|---|
| siteId (required) | string | The site ID from the /sites endpoint |
Query Parameters
| Name | Type | Default | Description |
|---|---|---|---|
| start | string | 7 days ago | Start date in YYYY-MM-DD format |
| end | string | today | End date in YYYY-MM-DD format |
| limit | integer | 50 | Maximum pages to return (1-100) |
| sort | string | bot_visits | Sort by: bot_visits, referrals, or conversions |
Request
Get Page Metrics

```shell
curl -X GET "https://attractos.com/api/v1/sites/site_abc123/pages?start=2024-01-01&end=2024-01-31&sort=referrals&limit=20" \
  -H "Authorization: Bearer YOUR_API_KEY"
```

```javascript
const siteId = 'site_abc123';
const params = new URLSearchParams();
params.set('start', '2024-01-01');
params.set('end', '2024-01-31');
params.set('sort', 'referrals');
params.set('limit', '20');

const url = `https://attractos.com/api/v1/sites/${siteId}/pages?${params}`;
const response = await fetch(url, {
  headers: { 'Authorization': 'Bearer YOUR_API_KEY' },
});
const data = await response.json();

// Find pages with high crawls but low citations
const underperforming = data.pages.filter((p) => p.bot_visits > 100 && p.referrals < 5);
console.log('Pages to optimize:', underperforming);
```

```python
import requests

site_id = 'site_abc123'
response = requests.get(
    f'https://attractos.com/api/v1/sites/{site_id}/pages',
    headers={'Authorization': 'Bearer YOUR_API_KEY'},
    params={
        'start': '2024-01-01',
        'end': '2024-01-31',
        'sort': 'referrals',
        'limit': 20,
    },
)
data = response.json()

# Print the top performing pages
for page in data['pages'][:10]:
    print(page['path'])
    print(f"  Bots: {page['bot_visits']}, Refs: {page['referrals']}")
```
Response
200

```json
{
  "pages": [
    {
      "path": "/guides/remote-work-setup",
      "bot_visits": 523,
      "referrals": 89,
      "conversions": 12,
      "top_bots": ["GPTBot", "PerplexityBot", "ClaudeBot"]
    },
    {
      "path": "/blog/best-tools-2024",
      "bot_visits": 412,
      "referrals": 67,
      "conversions": 8,
      "top_bots": ["GPTBot", "ClaudeBot", "Googlebot"]
    },
    {
      "path": "/pricing",
      "bot_visits": 234,
      "referrals": 45,
      "conversions": 23,
      "top_bots": ["PerplexityBot", "GPTBot"]
    }
  ]
}
```

Response Fields
| Field | Type | Description |
|---|---|---|
| path | string | The page path (e.g., "/blog/post-title") |
| bot_visits | integer | Total AI bot crawls on this page |
| referrals | integer | Total LLM referral clicks to this page |
| conversions | integer | Total conversions attributed to this page |
| top_bots | array | Top 3 bots by visit count (bot names only) |
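A useful metric to derive from these fields is a citation rate: the share of AI crawls that turned into LLM referral clicks. The sketch below uses the field names from the response schema above; the sample values are taken from the example response and the helper function is illustrative, not part of the API.

```python
def citation_rate(page):
    """Share of AI bot crawls on a page that became LLM referral clicks."""
    if page['bot_visits'] == 0:
        return 0.0
    return page['referrals'] / page['bot_visits']

# Sample pages mirroring the example response above
pages = [
    {"path": "/guides/remote-work-setup", "bot_visits": 523, "referrals": 89},
    {"path": "/pricing", "bot_visits": 234, "referrals": 45},
]

# Rank pages by how often a crawl turns into a citation click
for page in sorted(pages, key=citation_rate, reverse=True):
    print(f"{page['path']}: {citation_rate(page):.1%}")
```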
Use Cases
- Content audit — Find which pages AI systems engage with most
- AEO opportunities — High crawls + low referrals = optimization targets
- Conversion analysis — Which content paths lead to conversions
- Bot interest mapping — See what topics each AI is most interested in
Errors
| Status | Code | Description |
|---|---|---|
| 400 | MISSING_SITE_ID | Site ID not provided in path |
| 403 | SITE_NOT_FOUND | Site doesn't exist or you don't have access |
| 429 | RATE_LIMITED | Too many requests |
Frequently Asked Questions
Which pages should I prioritize for optimization?
Pages with high bot visits but low referrals indicate AI is crawling your content but not citing it. Consider improving these pages' Answer Engine Optimization (AEO) with DirectAnswer blocks and structured data.
What are "top_bots" in the response?
The top 3 AI bots that crawled each page, helping you understand which AI systems are most interested in specific content.
Why might some pages have referrals but no bot visits?
This can happen if bots crawled the page before you installed tracking, or if the AI service cached your content. It can also happen when Cloudflare Logpush is not enabled, since client-side tracking alone cannot detect every bot.