How far is Shanghai from Jammu?

The distance between Jammu (Jammu Airport) and Shanghai (Shanghai Hongqiao International Airport) is 2712 miles / 4364 kilometers / 2356 nautical miles.

The driving distance from Jammu (IXJ) to Shanghai (SHA) is 4043 miles / 6506 kilometers, and travel time by car is about 75 hours 54 minutes.

Jammu Airport – Shanghai Hongqiao International Airport

  • Distance: 2712 miles / 4364 kilometers / 2356 nautical miles
  • Flight time: 5 h 38 min
  • Time difference: 2 h 30 min
  • CO2 emission: 300 kg

Distance from Jammu to Shanghai

There are several ways to calculate the distance from Jammu to Shanghai. Here are two standard methods:

Vincenty's formula (applied above)
  • 2711.655 miles
  • 4363.986 kilometers
  • 2356.364 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
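For readers who want to reproduce the figure, here is a minimal sketch using the third-party geopy library (an assumption; the page does not say which implementation it uses). geopy's geodesic() runs Karney's algorithm on the WGS-84 ellipsoid, a modern refinement of Vincenty's method, so it should agree with the numbers above to within a fraction of a mile.

```python
# A minimal sketch, assuming the third-party geopy library is installed
# (pip install geopy). geodesic() computes ellipsoidal distance on WGS-84.
from geopy.distance import geodesic

jammu = (32.6889, 74.8372)      # IXJ: 32°41′20″N, 74°50′14″E
shanghai = (31.1978, 121.3358)  # SHA: 31°11′52″N, 121°20′9″E

d = geodesic(jammu, shanghai)
print(f"{d.miles:.3f} miles")             # ≈ 2711.7
print(f"{d.km:.3f} kilometers")           # ≈ 4364.0
print(f"{d.nautical:.3f} nautical miles") # ≈ 2356.4
```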

Haversine formula
  • 2706.110 miles
  • 4355.061 kilometers
  • 2351.545 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
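The haversine formula is simple enough to implement directly with the standard library. The sketch below uses a mean earth radius of 6371.0 km, which is why it lands slightly below the ellipsoidal figure above.

```python
# A self-contained haversine sketch using only the standard library.
# The earth is treated as a sphere of mean radius 6371.0 km.
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometers."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    dlat, dlon = lat2 - lat1, lon2 - lon1
    a = sin(dlat / 2) ** 2 + cos(lat1) * cos(lat2) * sin(dlon / 2) ** 2
    return 2 * 6371.0 * asin(sqrt(a))

km = haversine_km(32.6889, 74.8372, 31.1978, 121.3358)
print(f"{km:.3f} km")             # ≈ 4355
print(f"{km * 0.621371:.3f} mi")  # statute miles
print(f"{km / 1.852:.3f} nmi")    # nautical miles
```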

How long does it take to fly from Jammu to Shanghai?

The estimated flight time from Jammu Airport to Shanghai Hongqiao International Airport is 5 hours and 38 minutes.
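The page does not publish how this duration is derived. A common back-of-the-envelope approach (an assumption here, not the site's method) is distance divided by an average block speed, plus a fixed allowance for taxi, climb, and descent:

```python
# A hedged back-of-the-envelope sketch, NOT the site's published method:
# assume an average block speed of ~500 mph plus ~30 minutes of overhead.
# Tweaking these assumed constants moves the estimate toward 5 h 38 min.
def estimate_flight_time(distance_miles, avg_speed_mph=500, overhead_min=30):
    total_min = distance_miles / avg_speed_mph * 60 + overhead_min
    return divmod(round(total_min), 60)

hours, minutes = estimate_flight_time(2712)
print(f"{hours} h {minutes} min")  # ≈ 5 h 55 min with these assumed constants
```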

Flight carbon footprint between Jammu Airport (IXJ) and Shanghai Hongqiao International Airport (SHA)

On average, flying from Jammu to Shanghai generates about 300 kg of CO2 per passenger (300 kilograms is equal to 661 pounds). These figures are estimates and include only the CO2 generated by burning jet fuel.
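As a rough sketch of how such a number could be produced, one can multiply the flight distance by a flat per-passenger-kilometer emission factor. The ~0.069 kg/pkm factor below is back-solved from this page's own figures and is an assumption, not a published constant:

```python
# A rough per-passenger CO2 sketch. The emission factor is back-solved
# from this page's figures (300 kg over 4364 km) and is an assumption.
KG_CO2_PER_PASSENGER_KM = 0.069  # assumed factor

distance_km = 4364
co2_kg = distance_km * KG_CO2_PER_PASSENGER_KM
print(f"{co2_kg:.0f} kg CO2 per passenger")  # ≈ 301
print(f"{co2_kg * 2.20462:.0f} lbs")         # kilograms to pounds
```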

Map of flight path and driving directions from Jammu to Shanghai

See the map of the shortest flight path between Jammu Airport (IXJ) and Shanghai Hongqiao International Airport (SHA).

Airport information

Origin Jammu Airport
City: Jammu
Country: India
IATA Code: IXJ
ICAO Code: VIJU
Coordinates: 32°41′20″N, 74°50′14″E
Destination Shanghai Hongqiao International Airport
City: Shanghai
Country: China
IATA Code: SHA
ICAO Code: ZSSS
Coordinates: 31°11′52″N, 121°20′9″E
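The coordinates above are given in degrees, minutes, and seconds, while the distance formulas earlier on the page take decimal degrees. A small conversion sketch:

```python
# Converting the DMS coordinates above to the decimal degrees used by
# the distance calculations earlier on the page.
def dms_to_decimal(degrees, minutes, seconds, negative=False):
    """negative=True for southern latitudes or western longitudes."""
    value = degrees + minutes / 60 + seconds / 3600
    return -value if negative else value

# IXJ: 32°41′20″N, 74°50′14″E
print(dms_to_decimal(32, 41, 20), dms_to_decimal(74, 50, 14))
# SHA: 31°11′52″N, 121°20′9″E
print(dms_to_decimal(31, 11, 52), dms_to_decimal(121, 20, 9))
```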