Timeseries
The Timeseries endpoint returns metrics bucketed by day or hour for trend visualization. Perfect for building line charts, identifying traffic patterns, and spotting anomalies. Daily granularity is available on all plans; hourly granularity requires Pro.
`GET /api/v1/sites/{siteId}/timeseries`

Get time-bucketed metrics for a site.
Path Parameters
| Name | Type | Description |
|---|---|---|
| siteId (required) | string | The site ID from the /sites endpoint |
Query Parameters
| Name | Type | Default | Description |
|---|---|---|---|
| start | string | 7 days ago | Start date in YYYY-MM-DD format |
| end | string | today | End date in YYYY-MM-DD format |
| granularity | string | day | Time bucket size: "day" or "hour" (hourly is Pro only) |
Request
Get Daily Timeseries
cURL

```shell
# Daily metrics (default)
curl -X GET "https://attractos.com/api/v1/sites/site_abc123/timeseries?start=2024-01-01&end=2024-01-31" \
  -H "Authorization: Bearer YOUR_API_KEY"

# Hourly metrics (Pro only)
curl -X GET "https://attractos.com/api/v1/sites/site_abc123/timeseries?start=2024-01-15&end=2024-01-15&granularity=hour" \
  -H "Authorization: Bearer YOUR_API_KEY"
```

JavaScript

```javascript
const siteId = 'site_abc123';

const params = new URLSearchParams();
params.set('start', '2024-01-01');
params.set('end', '2024-01-31');
params.set('granularity', 'day');

const url = 'https://attractos.com/api/v1/sites/' + siteId + '/timeseries?' + params;
const response = await fetch(url, {
  headers: { 'Authorization': 'Bearer YOUR_API_KEY' },
});
const data = await response.json();

// Format for charting library
const chartData = data.data.map(function (point) {
  return {
    date: new Date(point.date),
    bots: point.bot_visits,
    referrals: point.referrals,
    conversions: point.conversions,
  };
});
```

Python

```python
import requests
import pandas as pd

site_id = 'site_abc123'
response = requests.get(
    f'https://attractos.com/api/v1/sites/{site_id}/timeseries',
    headers={'Authorization': 'Bearer YOUR_API_KEY'},
    params={
        'start': '2024-01-01',
        'end': '2024-01-31',
        'granularity': 'day',
    },
)
data = response.json()

# Create DataFrame for analysis
df = pd.DataFrame(data['data'])
df['date'] = pd.to_datetime(df['date'])
df.set_index('date', inplace=True)
print(df.describe())
```

Response
Daily Granularity
200

```json
{
  "granularity": "day",
  "data": [
    {
      "date": "2024-01-01",
      "bot_visits": 234,
      "referrals": 45,
      "conversions": 3
    },
    {
      "date": "2024-01-02",
      "bot_visits": 289,
      "referrals": 52,
      "conversions": 5
    },
    {
      "date": "2024-01-03",
      "bot_visits": 198,
      "referrals": 38,
      "conversions": 2
    }
  ]
}
```

Hourly Granularity (Pro)

200

```json
{
  "granularity": "hour",
  "data": [
    {
      "datetime": "2024-01-15 00:00",
      "bot_visits": 12,
      "referrals": 2,
      "conversions": 0
    },
    {
      "datetime": "2024-01-15 01:00",
      "bot_visits": 8,
      "referrals": 1,
      "conversions": 0
    },
    {
      "datetime": "2024-01-15 02:00",
      "bot_visits": 15,
      "referrals": 3,
      "conversions": 1
    }
  ]
}
```

Response Fields
| Field | Type | Description |
|---|---|---|
| granularity | string | "day" or "hour" |
| data[].date | string | Date in YYYY-MM-DD format (daily granularity) |
| data[].datetime | string | Date and hour in "YYYY-MM-DD HH:00" format (hourly granularity) |
| data[].bot_visits | integer | Bot crawls in this period |
| data[].referrals | integer | LLM referral clicks in this period |
| data[].conversions | integer | Conversions in this period |
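Daily and hourly responses key the timestamp differently (`data[].date` vs. `data[].datetime`), so client code that supports both granularities may want to normalize them first. A minimal Python sketch, assuming the UTC formats documented above; `parse_bucket_timestamp` is a hypothetical helper name, not part of the API or any official SDK:

```python
from datetime import datetime, timezone

def parse_bucket_timestamp(point):
    """Return a timezone-aware UTC datetime for a timeseries point,
    whichever of the daily 'date' or hourly 'datetime' keys it uses.
    (Hypothetical helper; not part of the API or an official SDK.)"""
    if 'datetime' in point:
        dt = datetime.strptime(point['datetime'], '%Y-%m-%d %H:%M')
    else:
        dt = datetime.strptime(point['date'], '%Y-%m-%d')
    return dt.replace(tzinfo=timezone.utc)
```

With a normalizer like this, the same charting code can consume either granularity.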
Use Cases
- Trend visualization — Build line/area charts showing AI traffic over time
- Pattern detection — Identify when bots typically crawl (time of day, day of week)
- Anomaly detection — Spot unusual spikes in referrals or conversions
- Reporting — Generate weekly/monthly AI traffic reports
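For the anomaly-detection use case, one common approach is to compare each bucket against a trailing rolling baseline built from this endpoint's data. A sketch with pandas, continuing the DataFrame from the Python request example; the window size and z threshold are illustrative choices, not API parameters:

```python
import pandas as pd

def flag_referral_spikes(df, window=7, z=2.0):
    """Flag rows whose referrals exceed the trailing rolling mean by
    z standard deviations. window and z are illustrative defaults."""
    # shift(1) keeps the current bucket out of its own baseline
    baseline = df['referrals'].shift(1).rolling(window, min_periods=window).mean()
    spread = df['referrals'].shift(1).rolling(window, min_periods=window).std()
    return df[df['referrals'] > baseline + z * spread]
```

Rows inside the warm-up window (where the baseline is undefined) are never flagged, so the first `window` buckets pass through silently.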
Errors
| Status | Code | Description |
|---|---|---|
| 400 | MISSING_SITE_ID | Site ID not provided in path |
| 403 | SITE_NOT_FOUND | Site doesn't exist or you don't have access |
| 403 | PLAN_REQUIRED | Hourly granularity requires Pro plan |
| 429 | RATE_LIMITED | Too many requests |

Frequently Asked Questions
Why is hourly granularity Pro-only?
Hourly data requires significantly more database queries and storage. It's useful for real-time monitoring and detailed analysis, which are Pro-tier features.
What time zone are the timestamps in?
All timestamps are in UTC. For daily granularity, the date represents a full 24-hour period in UTC. Adjust for your local time zone in your application.
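With pandas, for example, you can mark the returned timestamps as UTC and then convert them for display (the target zone below is just an example):

```python
import pandas as pd

# Hourly buckets as returned by the API; timestamps are UTC.
df = pd.DataFrame({
    'datetime': ['2024-01-15 00:00', '2024-01-15 01:00'],
    'bot_visits': [12, 8],
})
df['datetime'] = pd.to_datetime(df['datetime']).dt.tz_localize('UTC')
df['local'] = df['datetime'].dt.tz_convert('America/New_York')
```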
Are there any missing data points?
Days/hours with zero events are included in the response with all values as 0. This makes it easier to render continuous charts without gaps.
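Because zero buckets are always present, downstream aggregation needs no gap-filling. Rolling daily data up into weekly totals for a report, for instance, is a single `resample` in pandas (the sample values below are illustrative, not real API output):

```python
import pandas as pd

# Two weeks of daily buckets, including the zero days the API fills in.
df = pd.DataFrame({
    'date': pd.date_range('2024-01-01', periods=14, freq='D'),
    'bot_visits': [234, 289, 198, 0, 150, 175, 210,
                   190, 0, 220, 205, 240, 230, 260],
})
weekly = df.set_index('date').resample('7D').sum()  # one row per 7-day window
```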