
How far is Petersburg, AK, from Wuhan?

The distance between Wuhan (Wuhan Tianhe International Airport) and Petersburg (Petersburg James A. Johnson Airport) is 5248 miles / 8445 kilometers / 4560 nautical miles.

Wuhan Tianhe International Airport – Petersburg James A. Johnson Airport

5248 miles
8445 kilometers
4560 nautical miles
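The three figures above are the same distance expressed in different units. A quick sketch of the conversions, using the exact definitions (1 mile = 1.609344 km, 1 nautical mile = 1.852 km) and the unrounded Vincenty mileage given further down the page:

```python
# Convert the WUH-PSG distance between statute miles, kilometres, and nautical miles.
MILES_TO_KM = 1.609344   # exact, by definition of the international mile
KM_PER_NMI = 1.852       # exact, by definition of the nautical mile

miles = 5247.572                  # Vincenty distance in statute miles
km = miles * MILES_TO_KM          # ~8445.15 km
nmi = km / KM_PER_NMI             # ~4560.02 nautical miles

print(f"{miles:.3f} mi = {km:.3f} km = {nmi:.3f} nmi")
```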


Distance from Wuhan to Petersburg

There are several ways to calculate the distance from Wuhan to Petersburg. Here are two standard methods:

Vincenty's formula (applied above)
  • 5247.572 miles
  • 8445.148 kilometers
  • 4560.015 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
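As a sketch of how that ellipsoidal calculation works, here is a self-contained implementation of Vincenty's inverse method. It assumes the WGS-84 ellipsoid (the page does not state which ellipsoid parameters it uses) and takes the airport coordinates from the table below, converted from degrees-minutes-seconds, so the result should agree with the quoted figure only to within a kilometre or two:

```python
import math

def dms(deg, minutes, seconds):
    """Convert degrees/minutes/seconds to decimal degrees."""
    return deg + minutes / 60 + seconds / 3600

def vincenty_km(lat1, lon1, lat2, lon2,
                a=6378137.0, f=1 / 298.257223563):
    """Vincenty inverse solution on the WGS-84 ellipsoid, in kilometres."""
    b = (1 - f) * a                       # semi-minor axis
    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):  # iterate until the auxiliary longitude converges
        sinLam, cosLam = math.sin(lam), math.cos(lam)
        sinSigma = math.hypot(cosU2 * sinLam,
                              cosU1 * sinU2 - sinU1 * cosU2 * cosLam)
        if sinSigma == 0:
            return 0.0  # coincident points
        cosSigma = sinU1 * sinU2 + cosU1 * cosU2 * cosLam
        sigma = math.atan2(sinSigma, cosSigma)
        sinAlpha = cosU1 * cosU2 * sinLam / sinSigma
        cos2Alpha = 1 - sinAlpha ** 2
        cos2SigmaM = (cosSigma - 2 * sinU1 * sinU2 / cos2Alpha
                      if cos2Alpha else 0.0)  # equatorial line case
        C = f / 16 * cos2Alpha * (4 + f * (4 - 3 * cos2Alpha))
        lamPrev = lam
        lam = L + (1 - C) * f * sinAlpha * (
            sigma + C * sinSigma * (
                cos2SigmaM + C * cosSigma * (-1 + 2 * cos2SigmaM ** 2)))
        if abs(lam - lamPrev) < 1e-12:
            break

    u2 = cos2Alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    dSigma = B * sinSigma * (cos2SigmaM + B / 4 * (
        cosSigma * (-1 + 2 * cos2SigmaM ** 2) -
        B / 6 * cos2SigmaM * (-3 + 4 * sinSigma ** 2)
        * (-3 + 4 * cos2SigmaM ** 2)))
    return b * A * (sigma - dSigma) / 1000  # metres -> kilometres

# WUH 30°47′1″N 114°12′28″E  ->  PSG 56°48′6″N 132°56′42″W
d = vincenty_km(dms(30, 47, 1), dms(114, 12, 28),
                dms(56, 48, 6), -dms(132, 56, 42))
print(f"{d:.1f} km")  # close to the 8445 km quoted above
```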

Haversine formula
  • 5235.988 miles
  • 8426.505 kilometers
  • 4549.949 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
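A minimal sketch of the haversine calculation for this route, assuming a mean Earth radius of 6371.0 km (the page does not state which radius it uses, so the result matches the quoted figure only approximately):

```python
import math

EARTH_RADIUS_KM = 6371.0  # mean Earth radius; an assumption, not the site's stated value

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points, in kilometres."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * EARTH_RADIUS_KM * math.asin(math.sqrt(a))

# WUH (30°47′1″N, 114°12′28″E) to PSG (56°48′6″N, 132°56′42″W)
d = haversine_km(30.78361, 114.20778, 56.80167, -132.94500)
print(f"{d:.1f} km")  # close to the 8426.5 km quoted above
```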

How long does it take to fly from Wuhan to Petersburg?

The estimated flight time from Wuhan Tianhe International Airport to Petersburg James A. Johnson Airport is 10 hours and 26 minutes.
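The page does not say how it derives this figure. A common rule of thumb, assumed here, is to divide the distance by an average block speed of roughly 500 mph, which lands within a few minutes of the quoted time:

```python
distance_mi = 5247.572    # Vincenty distance from above
avg_speed_mph = 500.0     # assumed average block speed, not the site's parameter

hours_float = distance_mi / avg_speed_mph
h = int(hours_float)
m = round((hours_float - h) * 60)
print(f"about {h} h {m} min")  # ~10 h 30 min, close to the quoted 10 h 26 min
```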

Flight carbon footprint between Wuhan Tianhe International Airport (WUH) and Petersburg James A. Johnson Airport (PSG)

On average, flying from Wuhan to Petersburg generates about 616 kg (1,359 lb) of CO2 per passenger. These figures are estimates and include only the CO2 produced by burning jet fuel.
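The kilogram-to-pound conversion behind those figures is a single multiplication. Converting the rounded 616 kg gives 1,358 lb, one pound off the quoted value, so the site presumably converts an unrounded kilogram estimate:

```python
KG_TO_LB = 2.20462262185  # pounds per kilogram

co2_kg = 616.0            # rounded per-passenger estimate from the text
co2_lb = co2_kg * KG_TO_LB
print(f"{co2_lb:.0f} lb")  # 1358 lb from the rounded figure; the site quotes 1,359 lb
```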

Map of flight path from Wuhan to Petersburg

See the map of the shortest flight path between Wuhan Tianhe International Airport (WUH) and Petersburg James A. Johnson Airport (PSG).

Airport information

Origin Wuhan Tianhe International Airport
City: Wuhan
Country: China
IATA Code: WUH
ICAO Code: ZHHH
Coordinates: 30°47′1″N, 114°12′28″E
Destination Petersburg James A. Johnson Airport
City: Petersburg, AK
Country: United States
IATA Code: PSG
ICAO Code: PAPG
Coordinates: 56°48′6″N, 132°56′42″W