
How far is Ponta Porã from Pucon?

The distance between Pucon (Pucón Airport) and Ponta Porã (Ponta Porã International Airport) is 1497 miles / 2410 kilometers / 1301 nautical miles.

The driving distance from Pucon (ZPC) to Ponta Porã (PMG) is 2069 miles / 3330 kilometers, and travel time by car is about 42 hours 59 minutes.

Pucón Airport – Ponta Porã International Airport

  • 1497 miles
  • 2410 kilometers
  • 1301 nautical miles


Distance from Pucon to Ponta Porã

There are several ways to calculate the distance from Pucon to Ponta Porã. Here are two standard methods:

Vincenty's formula (applied above)
  • 1497.452 miles
  • 2409.916 kilometers
  • 1301.251 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
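The iteration behind Vincenty's inverse method fits in a few dozen lines. Below is a minimal Python sketch on the WGS-84 ellipsoid using the airport coordinates listed at the bottom of this page; the calculator's own implementation is not published, so treat this as an illustration rather than its exact code.

```python
import math

def vincenty_distance(lat1, lon1, lat2, lon2):
    """Vincenty inverse formula on the WGS-84 ellipsoid; returns metres."""
    a = 6378137.0                 # semi-major axis (m)
    f = 1 / 298.257223563         # flattening
    b = (1 - f) * a               # semi-minor axis (m)

    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(phi1))   # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(phi2))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):          # iterate longitude on the auxiliary sphere
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos_sq_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = cos_sigma - 2 * sinU1 * sinU2 / cos_sq_alpha
        C = f / 16 * cos_sq_alpha * (4 + f * (4 - 3 * cos_sq_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u_sq = cos_sq_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u_sq / 16384 * (4096 + u_sq * (-768 + u_sq * (320 - 175 * u_sq)))
    B = u_sq / 1024 * (256 + u_sq * (-128 + u_sq * (74 - 47 * u_sq)))
    delta_sigma = B * sin_sigma * (
        cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
            - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
            * (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma)

# ZPC (39°17′34″S, 71°54′57″W) to PMG (22°32′58″S, 55°42′9″W)
zpc = (-(39 + 17/60 + 34/3600), -(71 + 54/60 + 57/3600))
pmg = (-(22 + 32/60 + 58/3600), -(55 + 42/60 + 9/3600))
km = vincenty_distance(zpc[0], zpc[1], pmg[0], pmg[1]) / 1000
print(f"{km:.3f} km")
```

Run on these coordinates, the result agrees with the 2409.916 km figure quoted above to within a fraction of a kilometre.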

Haversine formula
  • 1498.829 miles
  • 2412.132 kilometers
  • 1302.447 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
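The haversine calculation is much shorter, since it needs no iteration. A minimal Python sketch, using a mean Earth radius of 6371 km (the radius the site uses is not stated, but this value reproduces its figure):

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a sphere of the given mean radius."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

# Pucón Airport to Ponta Porã International Airport
km = haversine_km(-39.292778, -71.915833, -22.549444, -55.7025)
print(f"{km:.1f} km")
```

This lands on roughly 2412 km, matching the haversine figure above; the ~2 km gap versus Vincenty reflects the spherical versus ellipsoidal Earth models.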

How long does it take to fly from Pucon to Ponta Porã?

The estimated flight time from Pucón Airport to Ponta Porã International Airport is 3 hours and 20 minutes.
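The site does not publish its flight-time formula. One simple model that reproduces the 3 h 20 m figure is dividing the Vincenty distance by an assumed average block speed of about 450 mph (an assumption, not the site's documented method):

```python
# Assumed model: distance / average block speed of ~450 mph.
# The 450 mph figure is an assumption chosen to match the quoted time.
distance_miles = 1497.452
avg_speed_mph = 450
hours = distance_miles / avg_speed_mph
h = int(hours)
m = round((hours - h) * 60)
print(f"{h} h {m} min")
```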

Flight carbon footprint between Pucón Airport (ZPC) and Ponta Porã International Airport (PMG)

On average, flying from Pucon to Ponta Porã generates about 179 kg of CO2 per passenger, equivalent to 395 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
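The pound figure follows directly from the standard kilogram-to-pound conversion factor:

```python
# Converting the 179 kg per-passenger estimate to pounds.
KG_TO_LB = 2.20462   # pounds per kilogram
co2_kg = 179
co2_lb = co2_kg * KG_TO_LB
print(round(co2_lb))   # 395
```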

Map of flight path and driving directions from Pucon to Ponta Porã

See the map of the shortest flight path between Pucón Airport (ZPC) and Ponta Porã International Airport (PMG).

Airport information

Origin Pucón Airport
City: Pucon
Country: Chile
IATA Code: ZPC
ICAO Code: SCPC
Coordinates: 39°17′34″S, 71°54′57″W
Destination Ponta Porã International Airport
City: Ponta Porã
Country: Brazil
IATA Code: PMG
ICAO Code: SBPP
Coordinates: 22°32′58″S, 55°42′9″W