Why Vessels Use Nautical Miles

The nautical mile is a unit of distance used in marine navigation and aviation. It is based on the circumference of the Earth, and its roots reach back to the distance measures used by the ancient Greeks and Romans.

The origins of the nautical mile can be traced back to ancient Greece, where sailors and geographers reckoned distances between ports in units of stadia, with one stadion roughly equal to 185 meters.

The Romans later adopted this system of measurement and used it throughout their extensive naval operations in the Mediterranean. They standardized the length of the stadion at about 185.32 meters, almost exactly one-tenth of the modern nautical mile.

Over time, the nautical mile evolved into its modern definition, which is based on the circumference of the Earth. In the 17th century, the English astronomer Edmund Gunter developed methods for navigating by the stars at sea, which required an accurate measurement of the Earth's circumference. He put that circumference at 24,901.45 miles, which, divided by 360, gives 69.17 miles per degree of arc; dividing again by 60 gives roughly 1.15 miles per minute of arc. That minute of arc became the basis for the modern nautical mile, defined as one minute of arc along a meridian of the Earth's surface.

Today, the nautical mile is used worldwide by mariners and aviators to measure distances at sea and in the air. It is defined as exactly 1,852 meters, about 15 percent longer than the statute (land) mile of roughly 1,609 meters.
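
To make the arithmetic concrete, here is a minimal sketch in Python of the conversion described above: the circumference figure divided by 360 degrees, then by 60 minutes of arc, then converted to meters. The circumference and the 1,852 m figure come from the text; the 1,609.344 meters-per-statute-mile factor is a standard conversion assumed here, and the result lands near, but not exactly on, the 1,852 m value later fixed by definition.

```python
# A rough sketch of the arc-minute arithmetic described above.
# The circumference figure comes from the article; the meters-per-statute-mile
# conversion (1,609.344) is a standard value assumed here.

EARTH_CIRCUMFERENCE_MILES = 24_901.45   # Gunter's estimate, per the article
METERS_PER_STATUTE_MILE = 1_609.344     # standard conversion factor

miles_per_degree = EARTH_CIRCUMFERENCE_MILES / 360   # ~69.17 statute miles
miles_per_arc_minute = miles_per_degree / 60          # ~1.15 statute miles

print(f"Statute miles per degree of arc: {miles_per_degree:.2f}")
print(f"Statute miles per minute of arc: {miles_per_arc_minute:.3f}")
print(f"Meters per minute of arc:        {miles_per_arc_minute * METERS_PER_STATUTE_MILE:.0f}")
# The modern nautical mile is fixed by definition at exactly 1,852 meters.
```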
