
How far is Beijing from Surgut?

The distance between Surgut (Surgut International Airport) and Beijing (Beijing Daxing International Airport) is 2355 miles / 3790 kilometers / 2047 nautical miles.

The driving distance from Surgut (SGC) to Beijing (PKX) is 3714 miles / 5977 kilometers, and travel time by car is about 72 hours 3 minutes.

Surgut International Airport – Beijing Daxing International Airport

2355 miles / 3790 kilometers / 2047 nautical miles


Distance from Surgut to Beijing

There are several ways to calculate the distance from Surgut to Beijing. Here are two standard methods:

Vincenty's formula (applied above)
  • 2355.237 miles
  • 3790.387 kilometers
  • 2046.645 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
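For readers who want to reproduce the figure, below is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid. The decimal coordinates are converted from the DMS values in the airport table at the end of this page; the iteration limit and convergence threshold are conventional choices, not parameters published by this calculator.

    import math

    def vincenty_distance(lat1, lon1, lat2, lon2):
        # WGS-84 ellipsoid parameters
        a = 6378137.0            # semi-major axis (metres)
        f = 1 / 298.257223563    # flattening
        b = (1 - f) * a          # semi-minor axis (metres)

        L = math.radians(lon2 - lon1)
        U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
        U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
        sinU1, cosU1 = math.sin(U1), math.cos(U1)
        sinU2, cosU2 = math.sin(U2), math.cos(U2)

        lam = L
        for _ in range(200):  # iterate lambda until convergence
            sin_lam, cos_lam = math.sin(lam), math.cos(lam)
            sin_sigma = math.sqrt((cosU2 * sin_lam) ** 2 +
                                  (cosU1 * sinU2 - sinU1 * cosU2 * cos_lam) ** 2)
            if sin_sigma == 0:
                return 0.0  # coincident points
            cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
            sigma = math.atan2(sin_sigma, cos_sigma)
            sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
            cos2_alpha = 1 - sin_alpha ** 2
            cos_2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                            if cos2_alpha else 0.0)  # guard for equatorial lines
            C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
            lam_prev = lam
            lam = L + (1 - C) * f * sin_alpha * (
                sigma + C * sin_sigma * (
                    cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
            if abs(lam - lam_prev) < 1e-12:
                break

        u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
        A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
        B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
        delta_sigma = B * sin_sigma * (
            cos_2sigma_m + B / 4 * (
                cos_sigma * (-1 + 2 * cos_2sigma_m ** 2) -
                B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2) *
                (-3 + 4 * cos_2sigma_m ** 2)))
        return b * A * (sigma - delta_sigma)  # geodesic length in metres

    # SGC and PKX in decimal degrees (from the DMS coordinates below)
    metres = vincenty_distance(61.343611, 73.401667, 39.509167, 116.410556)
    print(f"{metres / 1609.344:.3f} miles")  # ≈ 2355.2 miles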

Haversine formula
  • 2350.662 miles
  • 3783.024 kilometers
  • 2042.670 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
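The same comparison takes only a few lines of Python. The 6,371 km mean Earth radius below is the usual convention, though the site does not state which radius it uses:

    import math

    def haversine_distance(lat1, lon1, lat2, lon2, radius_km=6371.0):
        # Great-circle distance on a sphere of the given mean Earth radius
        phi1, phi2 = math.radians(lat1), math.radians(lat2)
        dphi = math.radians(lat2 - lat1)
        dlam = math.radians(lon2 - lon1)
        h = (math.sin(dphi / 2) ** 2 +
             math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
        return 2 * radius_km * math.asin(math.sqrt(h))

    km = haversine_distance(61.343611, 73.401667, 39.509167, 116.410556)
    print(f"{km:.3f} km")  # ≈ 3783 km, matching the haversine figure above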

How long does it take to fly from Surgut to Beijing?

The estimated flight time from Surgut International Airport to Beijing Daxing International Airport is 4 hours and 57 minutes.
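The calculator does not publish its timing model. A common rule of thumb, shown below as an assumption rather than the site's actual method, adds a fixed allowance for taxi, climb, and descent to the great-circle distance flown at a typical cruise speed:

    def estimated_flight_time(distance_miles, cruise_mph=500.0, overhead_min=30.0):
        # Rule-of-thumb model (assumed parameters, not the site's published method)
        total_min = overhead_min + distance_miles / cruise_mph * 60.0
        hours, minutes = divmod(round(total_min), 60)
        return f"{hours} hours {minutes} minutes"

    print(estimated_flight_time(2355))  # "5 hours 13 minutes" with these assumptions

With these assumed parameters the estimate lands near, though not exactly on, the 4 hours 57 minutes quoted above, so the site evidently uses a slightly different cruise speed or overhead.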

Flight carbon footprint between Surgut International Airport (SGC) and Beijing Daxing International Airport (PKX)

On average, flying from Surgut to Beijing generates about 258 kg of CO2 per passenger, equivalent to roughly 570 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
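The pounds figure is simply a unit conversion of the kilogram estimate; the site presumably rounds from an unrounded kilogram value:

    KG_TO_LB = 2.20462262  # pounds per kilogram

    co2_kg = 258.0
    print(f"{co2_kg * KG_TO_LB:.0f} lb")  # 569 lb; the page rounds to 570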

Map of flight path and driving directions from Surgut to Beijing

See the map of the shortest flight path between Surgut International Airport (SGC) and Beijing Daxing International Airport (PKX).

Airport information

Origin Surgut International Airport
City: Surgut
Country: Russia
IATA Code: SGC
ICAO Code: USRR
Coordinates: 61°20′37″N, 73°24′6″E
Destination Beijing Daxing International Airport
City: Beijing
Country: China
IATA Code: PKX
ICAO Code: ZBAD
Coordinates: 39°30′33″N, 116°24′38″E
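
The coordinates above are given in degrees, minutes, and seconds; the distance sketches earlier use their decimal form. A small helper for that conversion (the function name is ours, chosen for illustration):

    def dms_to_decimal(degrees, minutes, seconds, hemisphere):
        # Southern and western hemispheres take a negative sign
        sign = -1.0 if hemisphere in ("S", "W") else 1.0
        return sign * (degrees + minutes / 60.0 + seconds / 3600.0)

    print(dms_to_decimal(61, 20, 37, "N"))   # ≈ 61.343611 (SGC latitude)
    print(dms_to_decimal(116, 24, 38, "E"))  # ≈ 116.410556 (PKX longitude)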