How Can We Use Trigonometry to Measure the Distance of a Star from Earth, and How Long Does It Take?

In the same way that surveyors measure distances across uncharted land by creating three points of a triangle, astronomers measure the distances to the stars.

Mathematically, using trigonometry, if you know the length of a triangle's baseline and the measure of the two angles at its ends, you can work out the third angle (the three angles must sum to 180 degrees) and, from it, the distance to the triangle's third point.
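
To make that concrete, here is a minimal sketch in Python of the surveyor's version of the calculation; the function name and the 100-meter example are illustrative inventions, not figures from the text:

```python
import math

def distance_to_far_point(baseline, angle_a_deg, angle_b_deg):
    # The three angles of a triangle sum to 180 degrees, so the
    # far (apex) angle follows directly from the two base angles.
    apex_deg = 180.0 - angle_a_deg - angle_b_deg
    # Law of sines: the side running from corner A to the apex is
    # opposite angle B, so its length is baseline * sin(B) / sin(apex).
    return baseline * math.sin(math.radians(angle_b_deg)) / math.sin(math.radians(apex_deg))

# A surveyor's example: a 100-meter baseline with base angles of
# 88 and 89 degrees puts the far point about 1,910 meters away.
print(f"{distance_to_far_point(100.0, 88.0, 89.0):.0f} m")
```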

Because stars are so far away, the triangles needed to compute their distance must be vast.

Astronomers use the diameter of Earth's orbit around the Sun as the baseline of their celestial triangle: twice the average Earth-Sun distance of 93 million miles, or 186 million miles.

They pinpoint a star in the sky one night, and then again exactly six months later, when Earth is on the opposite side of its annual orbit around the Sun.

This gives them the information needed to figure out the star's parallax, the angle between the two long sides of the triangle where they meet at the star, and from that angle the star's distance.
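
In symbols, a small sketch of the geometry, with d as the distance from the baseline's midpoint (the Sun) to the star, B as the 186-million-mile baseline, and theta as the full parallax angle at the star:

$$ d = \frac{B/2}{\tan(\theta/2)} $$

Because stellar parallaxes are tiny fractions of a degree, tan(theta/2) is almost exactly theta/2 in radians, so the distance is, for all practical purposes, inversely proportional to the parallax.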

Once the parallax is determined, astronomers cut it in half.

This half-angle is the standard parallax angle, the value you would get using the radius, rather than the diameter, of Earth's orbit as the baseline of the triangle.
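
Using the orbit's radius (one astronomical unit, about 93 million miles) as the baseline is what makes the halved angle so convenient: a star with a parallax of exactly 1 arcsecond lies at a distance of 1 parsec, and distance in parsecs is simply the reciprocal of the parallax in arcseconds. A minimal sketch, using the roughly 0.77-arcsecond parallax of Proxima Centauri, the nearest star:

```python
import math

def distance_from_parallax(p_arcsec):
    # Parallax angle measured against a 1 AU (orbit-radius) baseline.
    p_rad = math.radians(p_arcsec / 3600.0)   # arcseconds -> radians
    d_au = 1.0 / math.tan(p_rad)              # exact trigonometric distance, in AU
    d_parsec = 1.0 / p_arcsec                 # the parsec shortcut: d = 1/p
    return d_au, d_parsec

# Proxima Centauri's parallax is about 0.77 arcseconds.
d_au, d_pc = distance_from_parallax(0.77)
print(f"{d_au:,.0f} AU, or {d_pc:.2f} parsecs")  # about 268,000 AU, 1.30 pc
```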

To measure the distance to a star, astronomers create an imaginary triangle between Earth, the star, and Earth again once our planet has completed one-half of its orbit.

Knowing the two base angles, they calculate the third, or parallax angle.

From the angle measurements, they compute the distance to the star.
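
Putting the whole procedure together, here is a sketch under the article's assumptions; the near-90-degree base angles below are hypothetical values chosen for illustration:

```python
import math

ORBIT_DIAMETER_MILES = 186_000_000  # the celestial triangle's baseline

def star_distance_miles(base_angle1_deg, base_angle2_deg):
    # Step 1: the two measured base angles give the parallax (apex) angle.
    parallax_deg = 180.0 - base_angle1_deg - base_angle2_deg
    # Step 2: halve the parallax, matching a radius-length baseline.
    half_p_rad = math.radians(parallax_deg / 2.0)
    # Step 3: half the baseline divided by the tangent of the half-angle.
    return (ORBIT_DIAMETER_MILES / 2.0) / math.tan(half_p_rad)

# Hypothetical base angles a tiny fraction of a degree short of 90:
d = star_distance_miles(89.99989, 89.99989)
print(f"{d:,.0f} miles")  # roughly 48 trillion miles, about 8 light-years
```

Note how close the base angles are to 90 degrees: the triangle is so long and thin that the slightest error in measuring them swings the computed distance enormously, which is why parallax only works for relatively nearby stars.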