How far is Jharsuguda from Wuxi?
The distance between Wuxi (Sunan Shuofang International Airport) and Jharsuguda (Jharsuguda Airport) is 2332 miles / 3753 kilometers / 2027 nautical miles.
The driving distance from Wuxi (WUX) to Jharsuguda (JRG) is 3361 miles / 5409 kilometers, and travel time by car is about 62 hours 59 minutes.
Sunan Shuofang International Airport – Jharsuguda Airport
Distance from Wuxi to Jharsuguda
There are several ways to calculate the distance from Wuxi to Jharsuguda. Here are two standard methods:
Vincenty's formula (applied above)
- 2332.147 miles
- 3753.226 kilometers
- 2026.580 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
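The calculation above can be reproduced with the standard Vincenty inverse algorithm on the WGS-84 ellipsoid. The sketch below is a minimal Python implementation (the airport coordinates are taken from the tables further down this page; the iteration limit and convergence threshold are conventional choices, not values stated here):

```python
import math

def vincenty_km(lat1, lon1, lat2, lon2):
    """Vincenty inverse formula: geodesic distance on the WGS-84 ellipsoid, in km."""
    a = 6378137.0                 # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563         # WGS-84 flattening
    b = (1 - f) * a               # semi-minor axis
    L = math.radians(lon2 - lon1)
    # Reduced latitudes on the auxiliary sphere
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)
    lam = L
    for _ in range(200):          # iterate until lambda converges
        sinlam, coslam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sinlam,
                               cosU1 * sinU2 - sinU1 * cosU2 * coslam)
        if sin_sigma == 0:
            return 0.0            # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * coslam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sinlam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sm = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sm + C * cos_sigma *
                                     (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break
    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2) -
        B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2) *
        (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - delta_sigma) / 1000.0

# WUX (31°29′39″N, 120°25′44″E) to JRG (21°54′48″N, 84°3′1″E)
print(vincenty_km(31.494167, 120.428889, 21.913333, 84.050278))  # ≈ 3753 km
```

This reproduces the 3753-kilometer figure quoted above to within a fraction of a kilometer.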
Haversine formula
- 2328.965 miles
- 3748.106 kilometers
- 2023.815 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
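The haversine figure above can be checked with a few lines of Python. This sketch assumes a mean Earth radius of 6371 km (a common convention, though the page does not state which radius it uses):

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance between two lat/lon points on a spherical Earth, in km."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

# WUX (31°29′39″N, 120°25′44″E) to JRG (21°54′48″N, 84°3′1″E)
print(haversine_km(31.494167, 120.428889, 21.913333, 84.050278))  # ≈ 3748 km
```

With that radius the result agrees with the 3748-kilometer figure listed above, slightly shorter than the ellipsoidal Vincenty distance, as expected for a spherical approximation.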
How long does it take to fly from Wuxi to Jharsuguda?
The estimated flight time from Sunan Shuofang International Airport to Jharsuguda Airport is 4 hours and 54 minutes.
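The page does not state how its flight time is derived. A common rule of thumb adds roughly 30 minutes of taxi, climb, and descent overhead to the great-circle distance flown at an assumed average cruise speed of about 500 mph; that yields roughly 5 hours 10 minutes here, in the same ballpark as (but not identical to) the 4 hours 54 minutes quoted above:

```python
def estimated_flight_time(distance_miles, cruise_mph=500, overhead_hours=0.5):
    """Rule-of-thumb flight time: fixed overhead plus distance at average cruise speed.

    cruise_mph and overhead_hours are illustrative assumptions, not values
    taken from this page. Returns (hours, minutes).
    """
    total_hours = overhead_hours + distance_miles / cruise_mph
    hours = int(total_hours)
    minutes = round((total_hours - hours) * 60)
    return hours, minutes

print(estimated_flight_time(2332.147))  # → (5, 10)
```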
What is the time difference between Wuxi and Jharsuguda?
The time difference between Wuxi and Jharsuguda is 2 hours 30 minutes. Wuxi is on China Standard Time (UTC+8) and Jharsuguda is on India Standard Time (UTC+5:30), so Wuxi is 2 hours 30 minutes ahead of Jharsuguda.
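Since China observes UTC+8 (zone `Asia/Shanghai`) and India UTC+5:30 (zone `Asia/Kolkata`), the offset can be confirmed with Python's standard `zoneinfo` module; neither zone observes daylight saving time, so any instant gives the same result:

```python
from datetime import datetime, timedelta
from zoneinfo import ZoneInfo

# Wuxi uses China Standard Time; Jharsuguda uses India Standard Time
instant = datetime(2024, 1, 1, 12, 0)  # arbitrary instant (no DST in either zone)
wuxi_offset = instant.replace(tzinfo=ZoneInfo("Asia/Shanghai")).utcoffset()
jrg_offset = instant.replace(tzinfo=ZoneInfo("Asia/Kolkata")).utcoffset()

print(wuxi_offset - jrg_offset)  # → 2:30:00
```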
Flight carbon footprint between Sunan Shuofang International Airport (WUX) and Jharsuguda Airport (JRG)
On average, flying from Wuxi to Jharsuguda generates about 256 kg (564 pounds) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
Map of flight path and driving directions from Wuxi to Jharsuguda
See the map of the shortest flight path between Sunan Shuofang International Airport (WUX) and Jharsuguda Airport (JRG).
Airport information
| Origin | Sunan Shuofang International Airport |
| --- | --- |
| City: | Wuxi |
| Country: | China |
| IATA Code: | WUX |
| ICAO Code: | ZSWX |
| Coordinates: | 31°29′39″N, 120°25′44″E |
| Destination | Jharsuguda Airport |
| --- | --- |
| City: | Jharsuguda |
| Country: | India |
| IATA Code: | JRG |
| ICAO Code: | VEJH |
| Coordinates: | 21°54′48″N, 84°3′1″E |