CO2 Bunny is a RESTful API service that calculates the carbon footprint of websites based on their data transfer, hosting energy sources, and traffic patterns.
https://bunny.srijit.co
Analyzes the carbon impact of a website's data transfer.

`GET /api/impact/data-transfer`

Parameter | Type | Description |
---|---|---|
url | string | Required. The URL of the website to analyze |

Example response:

```json
{
  "url": "example.com",
  "data_transfer_kb": 1234.56,
  "energy_used_kwh": 0.123,
  "carbon_emissions_grams": 0.456,
  "green_rating": "A",
  "cleanerThan": "95%"
}
```
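A minimal Python sketch of calling this endpoint, assuming the API is served from https://bunny.srijit.co; the helper names are illustrative, not part of the API:

```python
import json
from urllib.parse import urlencode
from urllib.request import urlopen

BASE = "https://bunny.srijit.co"  # assumed deployment base URL

def data_transfer_url(site: str) -> str:
    """Build the query URL for the data-transfer endpoint."""
    return f"{BASE}/api/impact/data-transfer?{urlencode({'url': site})}"

def fetch_data_transfer(site: str) -> dict:
    """Perform the GET request and decode the JSON response."""
    with urlopen(data_transfer_url(site)) as resp:
        return json.load(resp)

print(data_transfer_url("example.com"))
# https://bunny.srijit.co/api/impact/data-transfer?url=example.com
```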
Checks if a website is hosted on green energy servers.

`GET /api/impact/energy-source`

Parameter | Type | Description |
---|---|---|
url | string | Required. The URL of the website to analyze |

Example response:

```json
{
  "url": "example.com",
  "green_hosting": true,
  "provider": "Green Host Provider",
  "carbon_savings_grams": 10.0,
  "sustainability_report": "https://example.com/sustainability"
}
```
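A short sketch of interpreting this response, using the field names from the example payload above (the summary helper itself is illustrative):

```python
def hosting_summary(report: dict) -> str:
    """Turn an energy-source response into a one-line summary."""
    if report.get("green_hosting"):
        return (f"{report['url']} is green-hosted by {report['provider']} "
                f"(~{report['carbon_savings_grams']} g CO2 saved)")
    return f"{report['url']} does not appear to use green hosting"

sample = {
    "url": "example.com",
    "green_hosting": True,
    "provider": "Green Host Provider",
    "carbon_savings_grams": 10.0,
}
print(hosting_summary(sample))
```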
Calculates the carbon impact based on website traffic.

`GET /api/impact/traffic`

Parameter | Type | Description |
---|---|---|
url | string | Required. The URL of the website |
annualPageViews | number | Required. Estimated annual page views |

Example response:

```json
{
  "url": "example.com",
  "annual_page_views": 1000000,
  "carbon_per_view_grams": 0.3,
  "total_annual_emissions_kg": 300
}
```
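The arithmetic in this response can be checked locally: total annual emissions are per-view grams times annual page views, converted to kilograms. A sketch (the function name is illustrative):

```python
def total_annual_emissions_kg(carbon_per_view_grams: float,
                              annual_page_views: int) -> float:
    """grams/view * views, converted from grams to kilograms."""
    return carbon_per_view_grams * annual_page_views / 1000.0

# Reproduces the sample response: 0.3 g/view over 1,000,000 views
print(round(total_annual_emissions_kg(0.3, 1_000_000)))  # 300
```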
Performs a comprehensive analysis combining all metrics.

`POST /api/impact/calculate`

Request body:

```json
{
  "url": "example.com",
  "annualPageViews": 1000000
}
```

Example response:

```json
{
  "id": "...",
  "url": "example.com",
  "dataTransferKB": 1234.56,
  "energyUsedKWh": 0.123,
  "carbonEmissionsG": 0.456,
  "greenHosting": true,
  "provider": "Green Host Provider",
  "annualPageViews": 1000000,
  "carbonPerViewG": 0.3,
  "totalAnnualEmissionsKg": 295,
  "createdAt": "2024-03-15T12:00:00Z",
  "message": "New analysis created"
}
```
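A sketch of constructing this POST request with the body fields shown above; the base URL is assumed and the request is built but not sent here:

```python
import json
from urllib.request import Request, urlopen

BASE = "https://bunny.srijit.co"  # assumed deployment base URL

def build_calculate_request(site: str, annual_page_views: int) -> Request:
    """Build (but do not send) the POST request for a full analysis."""
    body = json.dumps({"url": site, "annualPageViews": annual_page_views})
    return Request(
        f"{BASE}/api/impact/calculate",
        data=body.encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_calculate_request("example.com", 1_000_000)
print(req.get_method(), req.full_url)
# POST https://bunny.srijit.co/api/impact/calculate
```

Sending it is then `urlopen(req)` followed by `json.load` on the response.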
Retrieves historical analyses for a specific URL.

`GET /api/analyses`

Parameter | Type | Description |
---|---|---|
url | string | Required. The URL to fetch analyses for |

Example response:

```json
[
  {
    "id": "...",
    "url": "example.com",
    "dataTransferKB": 1234.56,
    "energyUsedKWh": 0.123,
    "carbonEmissionsG": 0.456,
    "greenHosting": true,
    "provider": "Green Host Provider",
    "annualPageViews": 1000000,
    "carbonPerViewG": 0.3,
    "totalAnnualEmissionsKg": 295,
    "createdAt": "2024-03-15T12:00:00Z"
  }
]
```
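Since this endpoint returns a list of past analyses, a caller might aggregate them; a sketch using the `totalAnnualEmissionsKg` field shown above (the helper is illustrative):

```python
def average_emissions_kg(analyses: list[dict]) -> float:
    """Mean of totalAnnualEmissionsKg across historical analyses."""
    if not analyses:
        return 0.0
    return sum(a["totalAnnualEmissionsKg"] for a in analyses) / len(analyses)

history = [
    {"totalAnnualEmissionsKg": 295},
    {"totalAnnualEmissionsKg": 305},
]
print(average_emissions_kg(history))  # 300.0
```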
Retrieves the most recent analyses across all URLs.

`GET /api/analyses/recent`

Parameter | Type | Description |
---|---|---|
limit | number | Optional. Number of analyses to return (default: 10, max: 100) |

Example response:

```json
[
  {
    "id": "...",
    "url": "example.com",
    "dataTransferKB": 1234.56,
    "energyUsedKWh": 0.123,
    "carbonEmissionsG": 0.456,
    "greenHosting": true,
    "provider": "Green Host Provider",
    "annualPageViews": 1000000,
    "carbonPerViewG": 0.3,
    "totalAnnualEmissionsKg": 295,
    "createdAt": "2024-03-15T12:00:00Z"
  }
]
```
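A sketch of building the query with the documented limit bounds (default 10, max 100); clamping client-side mirrors the documented constraint but is an assumption about what the server would otherwise reject:

```python
from urllib.parse import urlencode

BASE = "https://bunny.srijit.co"  # assumed deployment base URL

def recent_analyses_url(limit: int = 10) -> str:
    """Build the query URL, clamping limit to the documented 1..100 range."""
    limit = max(1, min(limit, 100))
    return f"{BASE}/api/analyses/recent?{urlencode({'limit': limit})}"

print(recent_analyses_url(250))
# https://bunny.srijit.co/api/analyses/recent?limit=100
```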
The API uses conventional HTTP response codes to indicate the success or failure of requests:

- `200 OK`: Request successful
- `400 Bad Request`: Invalid parameters
- `500 Internal Server Error`: Server-side error

Error responses follow this format:

```json
{
  "error": "Error message description",
  "details": "Additional error details (if available)"
}
```
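Callers can branch on the status code and surface the error payload; a sketch assuming the error format above (`urllib` raises `HTTPError` for 4xx/5xx responses, and the exception object is file-like, so its JSON body can be read directly):

```python
import json
from urllib.error import HTTPError
from urllib.request import urlopen

def format_error(status: int, payload: dict) -> str:
    """Render the documented error payload as a readable message."""
    details = payload.get("details", "no details")
    return f"{status}: {payload.get('error')} ({details})"

def fetch_json(url: str) -> dict:
    """GET an endpoint; raise with the documented error format on failure."""
    try:
        with urlopen(url) as resp:
            return json.load(resp)
    except HTTPError as exc:  # 400 / 500 responses land here
        raise RuntimeError(format_error(exc.code, json.load(exc))) from exc

print(format_error(400, {"error": "Invalid parameters"}))
# 400: Invalid parameters (no details)
```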
The API currently does not implement rate limiting. Have fun with it.
Analyses are cached in a MongoDB database (an M10 cluster). Requesting an analysis for a URL and annual page view count that already exist in the database returns the cached analysis. Future plans include using MongoDB Charts to create cool analytics. That's why I'm hoarding the data.