When did Florida become part of the United States?

Florida, along with the Gulf Coast strip of land known as West Florida, was the next part of Spanish America to be lost.

In 1819, under the Adams-Onís Treaty, Spain ceded the Floridas to the United States, partly in return for American acknowledgment that Texas was part of New Spain.

That acknowledgment was short-lived.
