Where Did the Words Right and Left Come From and What Do the Terms Mean?

The word right first appeared in English as the Old English riht, meaning "straight."

To put things right is to straighten them out.

Because most people are right-handed, right took on the metaphorical meaning of "good" or "just," as in the Bill of Rights.


By implication, left came to suggest something wrong or inferior, as in a "left-handed compliment," which is really an insult.

The word right thus became a synonym for correct, while left, long associated with bad luck and evil, was left alone.
