How far is Batagay-Alyta from Sabetta?
The distance between Sabetta (Sabetta International Airport) and Batagay-Alyta (Sakkyryr Airport) is 1379 miles / 2220 kilometers / 1199 nautical miles.
Sabetta International Airport – Sakkyryr Airport
Distance from Sabetta to Batagay-Alyta
There are several ways to calculate the distance from Sabetta to Batagay-Alyta. Here are two standard methods:
Vincenty's formula (applied above)
- 1379.305 miles
- 2219.777 kilometers
- 1198.584 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
Haversine formula
- 1373.749 miles
- 2210.835 kilometers
- 1193.755 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
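As a concrete illustration, here is a minimal Python sketch of the haversine (great-circle) calculation using the airport coordinates listed in the tables below. It also computes an ellipsoidal (WGS-84) geodesic distance with the third-party pyproj library, which is comparable to the Vincenty figure; pyproj and the 6371 km earth radius are assumptions, not something stated on this page.

```python
from math import radians, sin, cos, asin, sqrt

# Airport coordinates from the tables below, converted to decimal degrees.
# SBT (Sabetta International): 71°13′9″N, 72°3′7″E
# SUK (Sakkyryr):              67°47′31″N, 130°23′38″E
SBT = (71 + 13/60 + 9/3600, 72 + 3/60 + 7/3600)
SUK = (67 + 47/60 + 31/3600, 130 + 23/60 + 38/3600)

def haversine_km(p1, p2, r_km=6371.0):
    """Great-circle distance between two (lat, lon) points on a spherical earth."""
    lat1, lon1, lat2, lon2 = map(radians, (*p1, *p2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * r_km * asin(sqrt(a))

d_km = haversine_km(SBT, SUK)
print(f"Haversine: {d_km:.0f} km / {d_km * 0.621371:.0f} mi / {d_km / 1.852:.0f} nmi")
# ≈ 2210 km / 1373 mi / 1193 nmi, close to the haversine figures above
# (small differences come from the assumed 6371 km earth radius).

# Ellipsoidal distance (WGS-84), comparable to the Vincenty result above.
# Requires the third-party pyproj package: pip install pyproj
from pyproj import Geod
_, _, d_m = Geod(ellps="WGS84").inv(SBT[1], SBT[0], SUK[1], SUK[0])
print(f"Ellipsoidal: {d_m / 1000:.0f} km")  # ≈ 2220 km
```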
How long does it take to fly from Sabetta to Batagay-Alyta?
The estimated flight time from Sabetta International Airport to Sakkyryr Airport is 3 hours and 6 minutes.
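The page does not state how this flight time is derived. A common convention is to divide the distance by a typical jet cruise speed and add a fixed allowance for taxi, climb and descent; the sketch below, with an assumed 500 mph average speed and a 20-minute allowance, happens to reproduce the 3 h 6 min figure, but those parameters are guesses rather than the site's documented formula.

```python
# Rough flight-time estimate: cruise time at an assumed average speed
# plus a fixed allowance for taxi, climb and descent. Both parameters
# are assumptions; the page does not document its exact formula.
DISTANCE_MI = 1379.305    # Vincenty distance from above
AVG_SPEED_MPH = 500       # assumed average block speed
FIXED_ALLOWANCE_MIN = 20  # assumed taxi/climb/descent allowance

total_min = DISTANCE_MI / AVG_SPEED_MPH * 60 + FIXED_ALLOWANCE_MIN
hours, minutes = divmod(round(total_min), 60)
print(f"Estimated flight time: {hours} h {minutes:02d} min")  # 3 h 06 min
```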
What is the time difference between Sabetta and Batagay-Alyta?
Flight carbon footprint between Sabetta International Airport (SBT) and Sakkyryr Airport (SUK)
On average, flying from Sabetta to Batagay-Alyta generates about 172 kg of CO2 per passenger (172 kilograms equals about 379 pounds). These figures are estimates and include only the CO2 generated by burning jet fuel.
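The page does not explain its emissions model, but the published numbers imply a factor of roughly 0.08 kg of CO2 per passenger-kilometre on this route. The sketch below simply applies that implied factor and converts kilograms to pounds; it is a back-of-envelope check, not the calculator's actual methodology.

```python
# Back-of-envelope CO2 check using the factor implied by the figures above
# (172 kg over ~2220 km ≈ 0.0775 kg CO2 per passenger-kilometre).
DISTANCE_KM = 2219.777
CO2_KG_PER_PAX_KM = 0.0775  # assumed, derived from the published estimate

co2_kg = DISTANCE_KM * CO2_KG_PER_PAX_KM
co2_lbs = co2_kg * 2.20462  # kilograms to pounds
print(f"≈ {co2_kg:.0f} kg CO2 per passenger ({co2_lbs:.0f} lbs)")
```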
Map of flight path from Sabetta to Batagay-Alyta
See the map of the shortest flight path between Sabetta International Airport (SBT) and Sakkyryr Airport (SUK).
Airport information
| Origin | Sabetta International Airport |
|---|---|
| City: | Sabetta |
| Country: | Russia |
| IATA Code: | SBT |
| ICAO Code: | USDA |
| Coordinates: | 71°13′9″N, 72°3′7″E |
| Destination | Sakkyryr Airport |
|---|---|
| City: | Batagay-Alyta |
| Country: | Russia |
| IATA Code: | SUK |
| ICAO Code: | UEBS |
| Coordinates: | 67°47′31″N, 130°23′38″E |