How far is Shihezi from Seletar?
The distance between Seletar (Seletar Airport) and Shihezi (Shihezi Huayuan Airport) is 3148 miles / 5065 kilometers / 2735 nautical miles.
The driving distance from Seletar (XSP) to Shihezi (SHF) is 4484 miles / 7216 kilometers, and travel time by car is about 83 hours 56 minutes.
Seletar Airport – Shihezi Huayuan Airport
Distance from Seletar to Shihezi
There are several ways to calculate the distance from Seletar to Shihezi. Here are two standard methods:
Vincenty's formula (applied above)
- 3147.514 miles
- 5065.432 kilometers
- 2735.115 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
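Vincenty's inverse problem has no closed-form solution and is solved iteratively. Below is a minimal sketch in Python on the WGS-84 ellipsoid, using the airport coordinates from the tables further down; the convergence tolerance and iteration cap are arbitrary choices, not part of the formula itself.

```python
import math

def vincenty_km(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Vincenty inverse formula on the WGS-84 ellipsoid; returns kilometers."""
    a = 6378137.0               # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563       # WGS-84 flattening
    b = (1 - f) * a
    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)
    lam = L
    for _ in range(max_iter):
        sinLam, cosLam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sinLam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cosLam)
        if sin_sigma == 0:
            return 0.0  # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cosLam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sinLam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                        if cos2_alpha else 0.0)  # guard: equatorial line
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:
            break
    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
            - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
            * (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma) / 1000.0

# XSP: 1°25′1″N, 103°52′4″E · SHF: 44°14′31″N, 85°53′25″E (decimal degrees)
xsp = (1.4169, 103.8678)
shf = (44.2419, 85.8903)
print(round(vincenty_km(*xsp, *shf), 1))  # ≈ 5065 km
```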
Haversine formula
- 3157.288 miles
- 5081.163 kilometers
- 2743.608 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest path between two points along the sphere's surface).
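The haversine calculation is compact enough to sketch directly. The mean Earth radius of 6371 km below is a common convention; a slightly different radius choice would explain small discrepancies from the figure above.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance on a sphere of mean Earth radius; returns km."""
    R = 6371.0  # mean Earth radius in km (assumed)
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
    return 2 * R * math.asin(math.sqrt(a))

# XSP: 1°25′1″N, 103°52′4″E · SHF: 44°14′31″N, 85°53′25″E (decimal degrees)
xsp = (1.4169, 103.8678)
shf = (44.2419, 85.8903)
print(round(haversine_km(*xsp, *shf)))  # ≈ 5081 km
```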
How long does it take to fly from Seletar to Shihezi?
The estimated flight time from Seletar Airport to Shihezi Huayuan Airport is 6 hours and 27 minutes.
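Estimates like this are typically distance divided by an assumed average speed plus a fixed allowance for taxi, climb, and descent. The 500 mph cruise speed and 30-minute overhead below are illustrative assumptions, not the site's actual constants, so the result will not exactly match the 6 hours 27 minutes quoted above.

```python
def estimate_flight_time(distance_miles, cruise_mph=500, overhead_min=30):
    """Rough flight-time estimate: cruise time plus a fixed ground/climb allowance.

    cruise_mph and overhead_min are assumed illustrative values.
    """
    total_min = distance_miles / cruise_mph * 60 + overhead_min
    hours, minutes = divmod(round(total_min), 60)
    return f"{hours} hours {minutes} minutes"

print(estimate_flight_time(3148))
```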
What is the time difference between Seletar and Shihezi?
The time difference between Seletar and Shihezi is 2 hours: Shihezi is 2 hours behind Seletar.
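The 2-hour offset reflects unofficial Xinjiang time (UTC+6) against Singapore time (UTC+8); official China Standard Time would give no difference. Assuming the site uses the local Xinjiang convention, this can be checked with the standard-library `zoneinfo` module, where `Asia/Urumqi` tracks UTC+6.

```python
from datetime import datetime
from zoneinfo import ZoneInfo

# Asia/Urumqi follows unofficial Xinjiang time (UTC+6);
# Asia/Shanghai (official China Standard Time) would be UTC+8.
t = datetime(2024, 6, 1, 12, 0)  # any date; neither zone observes DST
diff = ZoneInfo("Asia/Singapore").utcoffset(t) - ZoneInfo("Asia/Urumqi").utcoffset(t)
print(diff)  # 2:00:00
```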
Flight carbon footprint between Seletar Airport (XSP) and Shihezi Huayuan Airport (SHF)
On average, flying from Seletar to Shihezi generates about 352 kg (776 lb) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
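The kilogram-to-pound figure is a straightforward unit conversion at roughly 2.20462 lb per kg:

```python
KG_TO_LB = 2.20462  # pounds per kilogram

co2_kg = 352
co2_lb = round(co2_kg * KG_TO_LB)
print(co2_lb)  # 776
```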
Map of flight path and driving directions from Seletar to Shihezi
See the map of the shortest flight path between Seletar Airport (XSP) and Shihezi Huayuan Airport (SHF).
Airport information
| Origin | Seletar Airport |
| --- | --- |
| City: | Seletar |
| Country: | Singapore |
| IATA Code: | XSP |
| ICAO Code: | WSSL |
| Coordinates: | 1°25′1″N, 103°52′4″E |
| Destination | Shihezi Huayuan Airport |
| --- | --- |
| City: | Shihezi |
| Country: | China |
| IATA Code: | SHF |
| ICAO Code: | ZWHZ |
| Coordinates: | 44°14′31″N, 85°53′25″E |