How far is Batagay from Sabetta?
The distance between Sabetta (Sabetta International Airport) and Batagay (Batagay Airport) is 1476 miles / 2375 kilometers / 1282 nautical miles.
Sabetta International Airport – Batagay Airport
Distance from Sabetta to Batagay
There are several ways to calculate the distance from Sabetta to Batagay. Here are two standard methods:
Vincenty's formula (applied above)
- 1475.505 miles
- 2374.595 kilometers
- 1282.179 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
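For readers who want to reproduce the ellipsoidal figure, below is a minimal Python sketch of the standard Vincenty inverse method on the WGS-84 ellipsoid. The exact implementation used on this page is not published, so the constants, iteration cap, and tolerance here are assumptions.

```python
import math

def vincenty_inverse(lat1, lon1, lat2, lon2):
    """Distance in metres between two points on the WGS-84 ellipsoid
    using the standard Vincenty inverse formula."""
    a = 6378137.0           # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563   # WGS-84 flattening
    b = (1 - f) * a         # semi-minor axis

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):  # iterate lambda to convergence
        sinLam, cosLam = math.sin(lam), math.cos(lam)
        sinSigma = math.hypot(cosU2 * sinLam,
                              cosU1 * sinU2 - sinU1 * cosU2 * cosLam)
        if sinSigma == 0:
            return 0.0  # coincident points
        cosSigma = sinU1 * sinU2 + cosU1 * cosU2 * cosLam
        sigma = math.atan2(sinSigma, cosSigma)
        sinAlpha = cosU1 * cosU2 * sinLam / sinSigma
        cos2Alpha = 1 - sinAlpha ** 2
        # cos2Alpha is 0 only for points on the equator
        cos2SigmaM = cosSigma - 2 * sinU1 * sinU2 / cos2Alpha if cos2Alpha else 0.0
        C = f / 16 * cos2Alpha * (4 + f * (4 - 3 * cos2Alpha))
        lamPrev = lam
        lam = L + (1 - C) * f * sinAlpha * (
            sigma + C * sinSigma * (
                cos2SigmaM + C * cosSigma * (-1 + 2 * cos2SigmaM ** 2)))
        if abs(lam - lamPrev) < 1e-12:
            break

    u2 = cos2Alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    deltaSigma = B * sinSigma * (
        cos2SigmaM + B / 4 * (
            cosSigma * (-1 + 2 * cos2SigmaM ** 2)
            - B / 6 * cos2SigmaM * (-3 + 4 * sinSigma ** 2)
              * (-3 + 4 * cos2SigmaM ** 2)))
    return b * A * (sigma - deltaSigma)

m = vincenty_inverse(71.2192, 72.0519, 67.6478, 134.6950)  # SBT -> BQJ
print(m / 1000, m / 1609.344)  # ≈ 2374.6 km / 1475.5 mi, as quoted above
```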
Haversine formula
- 1469.564 miles
- 2365.034 kilometers
- 1277.016 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
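The haversine figure is easy to reproduce. The sketch below assumes a mean earth radius of 6371 km and uses the airport coordinates from the tables further down, converted to decimal degrees.

```python
import math

def haversine(lat1, lon1, lat2, lon2, r=6371.0):
    """Great-circle distance in km on a sphere of radius r (km)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

# SBT and BQJ coordinates from the airport tables below
print(haversine(71.2192, 72.0519, 67.6478, 134.6950))  # ≈ 2365 km, as above
```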
How long does it take to fly from Sabetta to Batagay?
The estimated flight time from Sabetta International Airport to Batagay Airport is 3 hours and 17 minutes.
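The page does not state how this estimate is derived. A common rule of thumb is an average cruise speed of about 500 mph plus roughly 30 minutes for take-off and landing; the sketch below uses those assumed values and lands close to, but not exactly on, the figure quoted above.

```python
# Hypothetical rule of thumb, not necessarily the method used on this page
distance_mi = 1476
cruise_mph = 500    # assumed average cruise speed
overhead_min = 30   # assumed allowance for take-off, climb, and descent
total_min = distance_mi / cruise_mph * 60 + overhead_min
print(f"{total_min // 60:.0f} h {total_min % 60:.0f} min")  # ≈ 3 h 27 min
```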
What is the time difference between Sabetta and Batagay?
The time difference between Sabetta and Batagay is 5 hours: Batagay is 5 hours ahead of Sabetta.
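This can be checked with standard time-zone data. The IANA zone names below are assumptions: Sabetta observes UTC+5 (Yekaterinburg time), and Batagay, in Yakutia's Verkhoyansky District, observes UTC+10.

```python
from datetime import datetime
from zoneinfo import ZoneInfo

# Assumed zone mapping: Sabetta keeps UTC+5, Batagay keeps UTC+10
now = datetime.now(tz=ZoneInfo("UTC"))
sabetta = now.astimezone(ZoneInfo("Asia/Yekaterinburg"))
batagay = now.astimezone(ZoneInfo("Asia/Vladivostok"))
print(batagay.utcoffset() - sabetta.utcoffset())  # 5:00:00 -> Batagay is ahead
```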
Flight carbon footprint between Sabetta International Airport (SBT) and Batagay Airport (BQJ)
On average, flying from Sabetta to Batagay generates about 178 kg (392 lb) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
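The unit conversion and the implied per-kilometre rate follow directly from the numbers above:

```python
KG_PER_LB = 0.45359237
co2_kg = 178                  # per-passenger estimate from this page
print(co2_kg / KG_PER_LB)     # ≈ 392 lb
print(co2_kg / 2375 * 1000)   # ≈ 75 g of CO2 per passenger-km implied
```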
Map of flight path from Sabetta to Batagay
See the map of the shortest flight path between Sabetta International Airport (SBT) and Batagay Airport (BQJ).
Airport information
Origin | Sabetta International Airport
---|---
City: | Sabetta
Country: | Russia
IATA Code: | SBT
ICAO Code: | USDA
Coordinates: | 71°13′9″N, 72°3′7″E
Destination | Batagay Airport
---|---
City: | Batagay
Country: | Russia
IATA Code: | BQJ
ICAO Code: | UEBB
Coordinates: | 67°38′52″N, 134°41′42″E
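The coordinates in both tables are given in degrees, minutes, and seconds; the small helper below converts them to the decimal degrees used in the distance sketches above.

```python
def dms_to_decimal(deg, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus a hemisphere letter to decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (deg + minutes / 60 + seconds / 3600)

# Coordinates from the tables above
print(dms_to_decimal(71, 13, 9, "N"), dms_to_decimal(72, 3, 7, "E"))      # SBT ≈ 71.2192, 72.0519
print(dms_to_decimal(67, 38, 52, "N"), dms_to_decimal(134, 41, 42, "E"))  # BQJ ≈ 67.6478, 134.6950
```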