How far is Nakashibetsu from Surgut?
The distance between Surgut (Surgut International Airport) and Nakashibetsu (Nakashibetsu Airport) is 3074 miles / 4948 kilometers / 2672 nautical miles.
The driving distance from Surgut (SGC) to Nakashibetsu (SHB) is 6342 miles / 10207 kilometers, and travel time by car is about 126 hours 57 minutes.
Surgut International Airport – Nakashibetsu Airport
Distance from Surgut to Nakashibetsu
There are several ways to calculate the distance from Surgut to Nakashibetsu. Here are two standard methods:
Vincenty's formula (applied above)
- 3074.442 miles
- 4947.834 kilometers
- 2671.617 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
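As an illustration, the iterative Vincenty inverse formula can be sketched in Python. This is a minimal sketch assuming the standard WGS-84 ellipsoid parameters; the exact implementation and ellipsoid used for the figures above are not specified on this page.

```python
import math

def vincenty_km(lat1, lon1, lat2, lon2, max_iter=200, tol=1e-12):
    """Ellipsoidal distance in km via Vincenty's inverse formula (WGS-84)."""
    a = 6378137.0               # semi-major axis in meters
    f = 1 / 298.257223563       # flattening
    b = (1 - f) * a             # semi-minor axis

    L = math.radians(lon2 - lon1)
    # Reduced latitudes on the auxiliary sphere
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sinLam, cosLam = math.sin(lam), math.cos(lam)
        sinSigma = math.sqrt((cosU2 * sinLam) ** 2 +
                             (cosU1 * sinU2 - sinU1 * cosU2 * cosLam) ** 2)
        if sinSigma == 0:
            return 0.0          # coincident points
        cosSigma = sinU1 * sinU2 + cosU1 * cosU2 * cosLam
        sigma = math.atan2(sinSigma, cosSigma)
        sinAlpha = cosU1 * cosU2 * sinLam / sinSigma
        cos2Alpha = 1 - sinAlpha ** 2
        cos2SigmaM = (cosSigma - 2 * sinU1 * sinU2 / cos2Alpha) if cos2Alpha else 0.0
        C = f / 16 * cos2Alpha * (4 + f * (4 - 3 * cos2Alpha))
        lamPrev = lam
        lam = L + (1 - C) * f * sinAlpha * (
            sigma + C * sinSigma * (cos2SigmaM +
                                    C * cosSigma * (-1 + 2 * cos2SigmaM ** 2)))
        if abs(lam - lamPrev) < tol:
            break               # converged

    u2 = cos2Alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    deltaSigma = B * sinSigma * (cos2SigmaM + B / 4 * (
        cosSigma * (-1 + 2 * cos2SigmaM ** 2) -
        B / 6 * cos2SigmaM * (-3 + 4 * sinSigma ** 2) * (-3 + 4 * cos2SigmaM ** 2)))
    return b * A * (sigma - deltaSigma) / 1000.0

# SGC (61°20′37″N, 73°24′6″E) to SHB (43°34′38″N, 144°57′36″E)
print(vincenty_km(61.343611, 73.401667, 43.577222, 144.96))
```

With these coordinates the result should land close to the 4947.834 km quoted above.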
Haversine formula
- 3065.896 miles
- 4934.081 kilometers
- 2664.191 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
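The haversine calculation is short enough to sketch directly. This sketch assumes the common mean Earth radius of 6371 km; the radius actually used for the figure above is an assumption.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance in km on a sphere of the given radius."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    # Haversine of the central angle between the two points
    h = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return radius_km * 2 * math.asin(math.sqrt(h))

# SGC (61°20′37″N, 73°24′6″E) to SHB (43°34′38″N, 144°57′36″E)
print(haversine_km(61.343611, 73.401667, 43.577222, 144.96))
```

This reproduces the roughly 4934 km figure listed above; the small gap to the Vincenty result reflects the spherical versus ellipsoidal Earth models.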
How long does it take to fly from Surgut to Nakashibetsu?
The estimated flight time from Surgut International Airport to Nakashibetsu Airport is 6 hours and 19 minutes.
What is the time difference between Surgut and Nakashibetsu?
Surgut observes Yekaterinburg Time (UTC+5) and Nakashibetsu observes Japan Standard Time (UTC+9), so Nakashibetsu is 4 hours ahead of Surgut.
Flight carbon footprint between Surgut International Airport (SGC) and Nakashibetsu Airport (SHB)
On average, flying from Surgut to Nakashibetsu generates about 343 kg (756 lb) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
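The pound figure follows from the exact kilogram-to-pound definition; a quick check:

```python
CO2_KG = 343                 # estimated CO2 per passenger, from above
KG_PER_LB = 0.45359237       # exact definition of the international pound
print(round(CO2_KG / KG_PER_LB))  # prints 756
```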
Map of flight path and driving directions from Surgut to Nakashibetsu
See the map of the shortest flight path between Surgut International Airport (SGC) and Nakashibetsu Airport (SHB).
Airport information
| Origin | Surgut International Airport |
| --- | --- |
| City: | Surgut |
| Country: | Russia |
| IATA Code: | SGC |
| ICAO Code: | USRR |
| Coordinates: | 61°20′37″N, 73°24′6″E |
| Destination | Nakashibetsu Airport |
| --- | --- |
| City: | Nakashibetsu |
| Country: | Japan |
| IATA Code: | SHB |
| ICAO Code: | RJCN |
| Coordinates: | 43°34′38″N, 144°57′36″E |