How far is Beijing from Sabetta?
The distance between Sabetta (Sabetta International Airport) and Beijing (Beijing Nanyuan Airport) is 2664 miles / 4287 kilometers / 2315 nautical miles.
Sabetta International Airport – Beijing Nanyuan Airport
Distance from Sabetta to Beijing
There are several ways to calculate the distance from Sabetta to Beijing. Here are two standard methods:
Vincenty's formula (applied above)
- 2663.978 miles
- 4287.257 kilometers
- 2314.934 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
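For a quick ellipsoidal check, the geopy library can reproduce these figures; a minimal sketch, noting that geopy's geodesic routine uses Karney's algorithm (a more robust successor to Vincenty's) on the WGS-84 ellipsoid, so the result should match to within rounding. The decimal coordinates are converted from the DMS values in the airport tables below:

```python
from geopy.distance import geodesic

# Decimal degrees converted from the DMS coordinates in the tables below
sabetta = (71.21917, 72.05194)   # 71°13′9″N, 72°3′7″E
beijing = (39.78278, 116.38778)  # 39°46′58″N, 116°23′16″E

# geodesic() defaults to the WGS-84 ellipsoid (Karney's algorithm)
d = geodesic(sabetta, beijing)
print(f"{d.miles:.3f} mi / {d.km:.3f} km / {d.nautical:.3f} nm")
```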
Haversine formula
- 2659.058 miles
- 4279.338 kilometers
- 2310.658 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (the great-circle distance, i.e. the shortest path between two points along the earth's surface).
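The formula is compact enough to implement directly; a minimal sketch, assuming the commonly used mean earth radius of 6371 km (the exact radius behind the figures above isn't stated, which accounts for the small difference):

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance between two lat/lon points on a spherical earth."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * radius_km * math.asin(math.sqrt(a))

# Same decimal coordinates as above
print(f"{haversine_km(71.21917, 72.05194, 39.78278, 116.38778):.0f} km")  # ~4280 km
```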
How long does it take to fly from Sabetta to Beijing?
The estimated flight time from Sabetta International Airport to Beijing Nanyuan Airport is 5 hours and 32 minutes.
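The page doesn't state how this estimate is derived; a common rule of thumb adds a fixed taxi/climb allowance to cruise time at an assumed average speed. A minimal sketch (the 500 mph cruise speed and 30-minute allowance are assumptions, so the result differs somewhat from the figure above):

```python
def estimate_flight_time(distance_miles, cruise_mph=500, overhead_min=30):
    """Rough block-time estimate: cruise time plus a fixed taxi/climb allowance."""
    total_min = distance_miles / cruise_mph * 60 + overhead_min
    hours, minutes = divmod(round(total_min), 60)
    return f"{hours} hours and {minutes} minutes"

print(estimate_flight_time(2664))  # "5 hours and 50 minutes" under these assumptions
```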
What is the time difference between Sabetta and Beijing?
The time difference between Sabetta and Beijing is 3 hours: Beijing is 3 hours ahead of Sabetta.
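Both zones are fixed offsets with no daylight saving: Sabetta, in the Yamalo-Nenets region, keeps Yekaterinburg time (UTC+5), while Beijing keeps China Standard Time (UTC+8). A quick check with Python's zoneinfo:

```python
from datetime import datetime
from zoneinfo import ZoneInfo

# Noon in Sabetta (Yekaterinburg time, UTC+5) ...
noon_sabetta = datetime(2024, 1, 1, 12, 0, tzinfo=ZoneInfo("Asia/Yekaterinburg"))
# ... is the same moment as 15:00 in Beijing (UTC+8)
in_beijing = noon_sabetta.astimezone(ZoneInfo("Asia/Shanghai"))
print(in_beijing.strftime("%H:%M"))  # 15:00 -> Beijing is 3 hours ahead
```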
Flight carbon footprint between Sabetta International Airport (SBT) and Beijing Nanyuan Airport (NAY)
On average, flying from Sabetta to Beijing generates about 294 kg (649 lb) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
Map of flight path from Sabetta to Beijing
See the map of the shortest flight path between Sabetta International Airport (SBT) and Beijing Nanyuan Airport (NAY).
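A map like this can be reproduced with matplotlib and cartopy; plotting with the Geodetic transform bends the segment along the great circle instead of drawing a straight line on the projection. A minimal sketch using the coordinates from the tables below:

```python
import matplotlib.pyplot as plt
import cartopy.crs as ccrs

sbt_lon, sbt_lat = 72.05194, 71.21917   # Sabetta (SBT)
nay_lon, nay_lat = 116.38778, 39.78278  # Beijing Nanyuan (NAY)

ax = plt.axes(projection=ccrs.PlateCarree())
ax.stock_img()
# Geodetic transform draws the shortest (great-circle) flight path
ax.plot([sbt_lon, nay_lon], [sbt_lat, nay_lat], color="red", transform=ccrs.Geodetic())
ax.plot([sbt_lon, nay_lon], [sbt_lat, nay_lat], "ko", transform=ccrs.PlateCarree())
plt.show()
```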
Airport information
| Origin | Sabetta International Airport |
| --- | --- |
| City | Sabetta |
| Country | Russia |
| IATA Code | SBT |
| ICAO Code | USDA |
| Coordinates | 71°13′9″N, 72°3′7″E |
| Destination | Beijing Nanyuan Airport |
| --- | --- |
| City | Beijing |
| Country | China |
| IATA Code | NAY |
| ICAO Code | ZBNY |
| Coordinates | 39°46′58″N, 116°23′16″E |
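The coordinates above are given in degrees/minutes/seconds; the decimal values used in the snippets earlier come from a conversion like this (dms_to_decimal is a hypothetical helper, not part of any library):

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert a DMS coordinate plus hemisphere letter to decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

print(dms_to_decimal(71, 13, 9, "N"))    # ≈71.21917 (Sabetta latitude)
print(dms_to_decimal(116, 23, 16, "E"))  # ≈116.38778 (Beijing Nanyuan longitude)
```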