How far is Jharsuguda from Qianjiang?
The distance between Qianjiang (Qianjiang Wulingshan Airport) and Jharsuguda (Jharsuguda Airport) is 1627 miles / 2619 kilometers / 1414 nautical miles.
The driving distance from Qianjiang (JIQ) to Jharsuguda (JRG) is 2564 miles / 4126 kilometers, and travel time by car is about 48 hours 40 minutes.
Qianjiang Wulingshan Airport – Jharsuguda Airport
Distance from Qianjiang to Jharsuguda
There are several ways to calculate the distance from Qianjiang to Jharsuguda. Here are two standard methods:
Vincenty's formula (applied above)
- 1627.481 miles
- 2619.176 kilometers
- 1414.242 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
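As a sketch of how Vincenty's inverse method works, the iteration below runs on the WGS-84 ellipsoid. The airport coordinates are taken from the tables further down, converted to decimal degrees; the iteration limit and tolerance are implementation choices, not part of the formula itself.

```python
import math

def vincenty_distance(lat1, lon1, lat2, lon2, max_iter=200, tol=1e-12):
    """Inverse Vincenty formula on the WGS-84 ellipsoid; returns meters."""
    a = 6378137.0                 # semi-major axis (m)
    f = 1 / 298.257223563         # flattening
    b = (1 - f) * a               # semi-minor axis (m)

    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(phi1))   # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(phi2))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0            # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        # cos(2 * sigma_m); degenerates to 0 for equatorial geodesics
        cos_2sm = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha if cos2_alpha else 0.0
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sm + B / 4 * (cos_sigma * (-1 + 2 * cos_2sm ** 2)
                           - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2)
                           * (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - delta_sigma)

# JIQ and JRG positions in decimal degrees
jiq = (29.513056, 108.830833)    # 29°30′47″N, 108°49′51″E
jrg = (21.913333, 84.050278)     # 21°54′48″N, 84°3′1″E
print(f"{vincenty_distance(*jiq, *jrg) / 1000:.3f} km")   # ≈ 2619 km
```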
Haversine formula
- 1625.555 miles
- 2616.077 kilometers
- 1412.568 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
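The spherical calculation is much simpler. A minimal sketch, using the conventional 6371 km mean Earth radius (the exact radius chosen affects the result slightly):

```python
import math

def haversine_distance(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a sphere with the given mean Earth radius."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    h = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(h))

# Same airport coordinates as above, in decimal degrees
print(f"{haversine_distance(29.513056, 108.830833, 21.913333, 84.050278):.1f} km")  # ≈ 2616 km
```

The ~3 km gap between this result and Vincenty's comes entirely from modeling the Earth as a sphere rather than an ellipsoid.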
How long does it take to fly from Qianjiang to Jharsuguda?
The estimated flight time from Qianjiang Wulingshan Airport to Jharsuguda Airport is 3 hours and 34 minutes.
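The page does not state how this figure is derived, but estimates like this are typically cruise time at an assumed average block speed plus a fixed allowance for taxi, climb, and descent. A minimal sketch, where the 500 mph speed and 30-minute allowance are illustrative assumptions and so the output differs slightly from the figure above:

```python
def estimated_flight_time(distance_miles, avg_speed_mph=500, overhead_min=30):
    """Rough block time: fixed overhead plus cruise at an assumed speed."""
    total_min = round(overhead_min + distance_miles / avg_speed_mph * 60)
    return divmod(total_min, 60)          # (hours, minutes)

hours, minutes = estimated_flight_time(1627.481)
print(f"about {hours} h {minutes} min")   # close to, not exactly, the published 3 h 34 min
```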
What is the time difference between Qianjiang and Jharsuguda?
Qianjiang observes China Standard Time (UTC+8) and Jharsuguda observes India Standard Time (UTC+5:30), so the time in Jharsuguda is 2 hours 30 minutes behind Qianjiang.
Flight carbon footprint between Qianjiang Wulingshan Airport (JIQ) and Jharsuguda Airport (JRG)
On average, flying from Qianjiang to Jharsuguda generates about 188 kg (414 lbs) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
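A common way to produce such an estimate is to multiply per-passenger fuel burn by 3.16 kg of CO2 per kg of jet fuel, the standard combustion factor. A sketch under that assumption, where the burn rate of 0.0227 kg of fuel per passenger-kilometer is an illustrative value chosen to roughly reproduce the 188 kg figure above, not the site's published methodology:

```python
CO2_PER_KG_FUEL = 3.16    # kg CO2 per kg of jet fuel burned (standard factor)
KG_TO_LBS = 2.20462

def co2_per_passenger(distance_km, fuel_kg_per_pax_km=0.0227):
    """Estimate per-passenger CO2 from an assumed average fuel-burn rate."""
    kg = distance_km * fuel_kg_per_pax_km * CO2_PER_KG_FUEL
    return kg, kg * KG_TO_LBS

kg, lbs = co2_per_passenger(2619.176)
print(f"{kg:.0f} kg ≈ {lbs:.0f} lbs")   # → 188 kg ≈ 414 lbs
```

Real per-flight calculators refine the burn rate by aircraft type, seat configuration, and load factor.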
Map of flight path and driving directions from Qianjiang to Jharsuguda
See the map of the shortest flight path between Qianjiang Wulingshan Airport (JIQ) and Jharsuguda Airport (JRG).
Airport information
| Origin | Qianjiang Wulingshan Airport |
| --- | --- |
| City: | Qianjiang |
| Country: | China |
| IATA Code: | JIQ |
| ICAO Code: | ZUQJ |
| Coordinates: | 29°30′47″N, 108°49′51″E |
| Destination | Jharsuguda Airport |
| --- | --- |
| City: | Jharsuguda |
| Country: | India |
| IATA Code: | JRG |
| ICAO Code: | VEJH |
| Coordinates: | 21°54′48″N, 84°3′1″E |