Where Did the Words Right and Left Come From and What Do the Terms Mean?

The word right entered English as the Old English riht, meaning "straight."

To put things right is to straighten them out.

Because most people are right-handed, right took on the metaphorical meaning of "good" or "just," as in the Bill of Rights.

By contrast, left came to suggest something wrong or insincere, as in a "left-handed compliment," which is really an insult.

The word right thus became a synonym for correct, while left kept its sinister reputation and so was left alone.

About Karen Hill

Karen Hill is a freelance writer, editor, and columnist born in New York. Her work has appeared in the Examiner, Yahoo News, and Buzzfeed, among others.
