
How far is Shanghai from Ban Houei?

The distance between Ban Houei (Ban Huoeisay Airport) and Shanghai (Shanghai Hongqiao International Airport) is 1501 miles / 2416 kilometers / 1304 nautical miles.

The driving distance from Ban Houei (HOE) to Shanghai (SHA) is 1980 miles / 3186 kilometers, and travel time by car is about 36 hours 38 minutes.


Distance from Ban Houei to Shanghai

There are several ways to calculate the distance from Ban Houei to Shanghai. Here are two standard methods:

Vincenty's formula (applied above)
  • 1500.989 miles
  • 2415.608 kilometers
  • 1304.324 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
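
For readers who want to reproduce the figure, here is a minimal Python sketch of Vincenty's inverse formula, assuming the WGS-84 ellipsoid (the usual choice; the page does not state which ellipsoid it uses). The coordinates are converted from the DMS values in the airport information below, and the function and variable names are illustrative.

```python
import math

# WGS-84 ellipsoid constants (assumed; Vincenty's formula works with any ellipsoid)
A = 6378137.0              # semi-major axis, meters
F = 1 / 298.257223563      # flattening
B = (1 - F) * A            # semi-minor axis, meters

def vincenty_meters(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Geodesic distance via Vincenty's inverse formula, in meters."""
    u1 = math.atan((1 - F) * math.tan(math.radians(lat1)))
    u2 = math.atan((1 - F) * math.tan(math.radians(lat2)))
    sin_u1, cos_u1 = math.sin(u1), math.cos(u1)
    sin_u2, cos_u2 = math.sin(u2), math.cos(u2)
    big_l = math.radians(lon2 - lon1)

    lam = big_l
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cos_u2 * sin_lam,
                               cos_u1 * sin_u2 - sin_u1 * cos_u2 * cos_lam)
        if sin_sigma == 0.0:
            return 0.0                       # coincident points
        cos_sigma = sin_u1 * sin_u2 + cos_u1 * cos_u2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cos_u1 * cos_u2 * sin_lam / sin_sigma
        cos2_alpha = 1.0 - sin_alpha ** 2
        # For a path along the equator cos2_alpha is 0 and cos_2sm drops out.
        cos_2sm = (cos_sigma - 2.0 * sin_u1 * sin_u2 / cos2_alpha) if cos2_alpha else 0.0
        c = F / 16.0 * cos2_alpha * (4.0 + F * (4.0 - 3.0 * cos2_alpha))
        lam_prev = lam
        lam = big_l + (1.0 - c) * F * sin_alpha * (
            sigma + c * sin_sigma * (cos_2sm + c * cos_sigma * (2.0 * cos_2sm ** 2 - 1.0)))
        if abs(lam - lam_prev) < tol:
            break

    u_sq = cos2_alpha * (A ** 2 - B ** 2) / B ** 2
    big_a = 1.0 + u_sq / 16384.0 * (4096.0 + u_sq * (-768.0 + u_sq * (320.0 - 175.0 * u_sq)))
    big_b = u_sq / 1024.0 * (256.0 + u_sq * (-128.0 + u_sq * (74.0 - 47.0 * u_sq)))
    delta_sigma = big_b * sin_sigma * (cos_2sm + big_b / 4.0 * (
        cos_sigma * (-1.0 + 2.0 * cos_2sm ** 2)
        - big_b / 6.0 * cos_2sm * (-3.0 + 4.0 * sin_sigma ** 2) * (-3.0 + 4.0 * cos_2sm ** 2)))
    return B * big_a * (sigma - delta_sigma)

# HOE and SHA coordinates, converted from the DMS values in the airport info below
hoe_lat, hoe_lon = 20 + 15/60 + 26/3600, 100 + 26/60 + 13/3600
sha_lat, sha_lon = 31 + 11/60 + 52/3600, 121 + 20/60 + 9/3600
m = vincenty_meters(hoe_lat, hoe_lon, sha_lat, sha_lon)
print(f"{m / 1609.344:.3f} mi  {m / 1000:.3f} km  {m / 1852:.3f} NM")
```

With the WGS-84 constants this should reproduce the Vincenty figures above to within rounding.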

Haversine formula
  • 1500.423 miles
  • 2414.697 kilometers
  • 1303.832 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical Earth (the great-circle distance, i.e. the shortest path between two points along the surface).
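
A matching sketch of the haversine formula. The result depends on the Earth radius chosen, and the page does not state its value, so the commonly used mean radius of 6371 km is assumed here:

```python
import math

EARTH_RADIUS_KM = 6371.0  # a common mean Earth radius; the page's exact value isn't stated

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance on a spherical Earth, in kilometers."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    h = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * EARTH_RADIUS_KM * math.asin(math.sqrt(h))

km = haversine_km(20 + 15/60 + 26/3600, 100 + 26/60 + 13/3600,
                  31 + 11/60 + 52/3600, 121 + 20/60 + 9/3600)
print(f"{km / 1.609344:.3f} mi  {km:.3f} km  {km / 1.852:.3f} NM")
```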

How long does it take to fly from Ban Houei to Shanghai?

The estimated flight time from Ban Huoeisay Airport to Shanghai Hongqiao International Airport is 3 hours and 20 minutes.
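
The page does not state how this estimate is derived. A common rule of thumb divides the distance by an average cruise speed and adds a fixed allowance for taxi, climb, and descent; the sketch below uses illustrative parameters (500 mph, 30 minutes), which land near, but not exactly on, the 3 hours 20 minutes quoted above:

```python
def estimated_flight_minutes(distance_miles, cruise_mph=500, allowance_min=30):
    """Rule-of-thumb flight time: cruise time plus a fixed taxi/climb/descent
    allowance. Both parameters are illustrative assumptions, not the site's
    published inputs."""
    return round(distance_miles / cruise_mph * 60 + allowance_min)

minutes = estimated_flight_minutes(1501)
print(f"about {minutes // 60} h {minutes % 60} min")  # ~3 h 30 min with these assumptions
```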

Flight carbon footprint between Ban Huoeisay Airport (HOE) and Shanghai Hongqiao International Airport (SHA)

On average, flying from Ban Houei to Shanghai generates about 180 kg (396 lbs) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
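
The emission model behind the 180 kg figure is not specified, but the unit conversion and the implied per-mile intensity can be checked directly from the numbers on this page:

```python
KG_PER_LB = 0.45359237          # exact definition of the avoirdupois pound

co2_kg = 180.0                  # per-passenger estimate quoted on this page
distance_miles = 1501.0

print(f"{co2_kg / KG_PER_LB:.1f} lbs")                            # 396.8, rounded to 396 above
print(f"{co2_kg / distance_miles:.3f} kg CO2 per passenger-mile")  # ~0.120
```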

Map of flight path and driving directions from Ban Houei to Shanghai

See the map of the shortest flight path between Ban Huoeisay Airport (HOE) and Shanghai Hongqiao International Airport (SHA).

Airport information

Origin: Ban Huoeisay Airport
City: Ban Houei
Country: Laos
IATA Code: HOE
ICAO Code: VLHS
Coordinates: 20°15′26″N, 100°26′13″E
Destination: Shanghai Hongqiao International Airport
City: Shanghai
Country: China
IATA Code: SHA
ICAO Code: ZSSS
Coordinates: 31°11′52″N, 121°20′9″E
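
The coordinates above are given in degrees, minutes, and seconds. Here is a small helper for converting them to the decimal degrees that the distance formulas earlier expect; the regular expression is written for this page's notation and is an assumption about the input format:

```python
import re

def dms_to_decimal(dms: str) -> float:
    """Convert a coordinate such as 20°15′26″N to decimal degrees.
    The pattern matches this page's notation; other sources format DMS differently."""
    m = re.fullmatch(r"(\d+)°(\d+)′(\d+)″([NSEW])", dms.strip())
    if m is None:
        raise ValueError(f"unrecognized DMS string: {dms!r}")
    deg, minutes, seconds, hemisphere = m.groups()
    value = int(deg) + int(minutes) / 60 + int(seconds) / 3600
    return -value if hemisphere in "SW" else value

print(dms_to_decimal("20°15′26″N"))   # 20.2572...
print(dms_to_decimal("121°20′9″E"))   # 121.3358...
```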