
How far is Huaihua from Baltimore, MD?

The distance between Baltimore (Baltimore–Washington International Airport) and Huaihua (Huaihua Zhijiang Airport) is 7829 miles / 12600 kilometers / 6803 nautical miles.

Distance from Baltimore to Huaihua

There are several ways to calculate the distance from Baltimore to Huaihua. Here are two standard methods:

Vincenty's formula (applied above)
  • 7829.291 miles
  • 12600.023 kilometers
  • 6803.468 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
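For readers who want to reproduce the figure above, here is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid. The site does not publish its implementation, so the function name, iteration tolerance, and the decimal-degree coordinates (converted from the DMS values under Airport information below) are our own assumptions:

    import math

    def vincenty_inverse(lat1, lon1, lat2, lon2, max_iter=200, tol=1e-12):
        """Ellipsoidal distance in metres between two points (Vincenty, WGS-84)."""
        a = 6378137.0              # WGS-84 semi-major axis (m)
        f = 1 / 298.257223563      # WGS-84 flattening
        b = (1 - f) * a            # semi-minor axis

        U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
        U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
        L = math.radians(lon2 - lon1)
        sinU1, cosU1 = math.sin(U1), math.cos(U1)
        sinU2, cosU2 = math.sin(U2), math.cos(U2)

        lam = L
        for _ in range(max_iter):
            sin_lam, cos_lam = math.sin(lam), math.cos(lam)
            sin_sigma = math.hypot(cosU2 * sin_lam,
                                   cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
            if sin_sigma == 0:
                return 0.0         # coincident points
            cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
            sigma = math.atan2(sin_sigma, cos_sigma)
            sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
            cos2_alpha = 1 - sin_alpha ** 2
            # cos(2*sigma_m); zero for equatorial lines where cos2_alpha == 0
            cos_2sm = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                       if cos2_alpha else 0.0)
            C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
            lam_prev = lam
            lam = L + (1 - C) * f * sin_alpha * (
                sigma + C * sin_sigma * (
                    cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
            if abs(lam - lam_prev) < tol:
                break

        u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
        A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
        B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
        d_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sm ** 2)
            - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2)
            * (-3 + 4 * cos_2sm ** 2)))
        return b * A * (sigma - d_sigma)

    # BWI and HJJ in decimal degrees (from the DMS coordinates listed below)
    metres = vincenty_inverse(39.175278, -76.668056, 27.440833, 109.7)
    print(f"{metres / 1609.344:.1f} mi / {metres / 1000:.1f} km")  # ≈ 7829.3 mi / 12600.0 km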

Haversine formula
  • 7815.761 miles
  • 12578.248 kilometers
  • 6791.711 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
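The haversine calculation is short enough to show in full. A sketch, assuming the conventional mean Earth radius of 6371 km (the site does not state which radius it uses):

    import math

    def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
        """Great-circle distance in kilometres on a sphere of the given radius."""
        phi1, phi2 = math.radians(lat1), math.radians(lat2)
        dphi = math.radians(lat2 - lat1)
        dlam = math.radians(lon2 - lon1)
        a = (math.sin(dphi / 2) ** 2
             + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
        return 2 * radius_km * math.asin(math.sqrt(a))

    print(haversine_km(39.175278, -76.668056, 27.440833, 109.7))  # ≈ 12578 km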

How long does it take to fly from Baltimore to Huaihua?

The estimated flight time from Baltimore–Washington International Airport to Huaihua Zhijiang Airport is 15 hours and 19 minutes.
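The page does not say how this estimate is derived. As a sanity check, the Vincenty distance and the quoted time imply an average block speed of roughly 511 mph, which the hypothetical back-calculation below assumes:

    distance_mi = 7829.291   # Vincenty distance from above
    avg_speed_mph = 511      # assumed average block speed, not stated by the source
    hours = distance_mi / avg_speed_mph
    h, m = int(hours), round(hours % 1 * 60)
    print(f"{h} h {m} min")  # ≈ 15 h 19 min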

Flight carbon footprint between Baltimore–Washington International Airport (BWI) and Huaihua Zhijiang Airport (HJJ)

On average, flying from Baltimore to Huaihua generates about 974 kg of CO2 per passenger, or roughly 2146 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
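As a quick unit check on those numbers (the exact conversion gives 2147 lb, so the page's 2146 likely reflects an unrounded kilogram figure):

    co2_kg = 974              # per-passenger estimate quoted above
    distance_mi = 7829.291    # Vincenty distance
    print(round(co2_kg * 2.20462), "lb")                              # 2147 (page shows 2146)
    print(round(co2_kg / distance_mi * 1000), "g CO2 per passenger-mile")  # ≈ 124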

Map of flight path from Baltimore to Huaihua

See the map of the shortest flight path between Baltimore–Washington International Airport (BWI) and Huaihua Zhijiang Airport (HJJ).

Airport information

Origin: Baltimore–Washington International Airport
City: Baltimore, MD
Country: United States
IATA Code: BWI
ICAO Code: KBWI
Coordinates: 39°10′31″N, 76°40′5″W

Destination: Huaihua Zhijiang Airport
City: Huaihua
Country: China
IATA Code: HJJ
ICAO Code: ZGCJ
Coordinates: 27°26′27″N, 109°42′0″E
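
The distance formulas above take decimal degrees, while the coordinates here are given in degrees/minutes/seconds. A small hypothetical helper for the conversion:

    def dms_to_decimal(deg, minutes, seconds, hemisphere):
        """Degrees/minutes/seconds plus hemisphere letter -> signed decimal degrees."""
        sign = -1 if hemisphere in ("S", "W") else 1
        return sign * (deg + minutes / 60 + seconds / 3600)

    print(dms_to_decimal(39, 10, 31, "N"), dms_to_decimal(76, 40, 5, "W"))   # BWI: 39.1753, -76.6681
    print(dms_to_decimal(27, 26, 27, "N"), dms_to_decimal(109, 42, 0, "E"))  # HJJ: 27.4408, 109.7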