When did Florida become part of the United States?

Florida, together with the Gulf Coast strip of land known as West Florida, was the next part of Spanish America to be lost.

In 1819, under the Adams-Onís Treaty, Spain ceded the Floridas to the United States, partly in return for American acknowledgment that Texas was part of New Spain.

That acknowledgment was short-lived.

About Karen Hill

Karen Hill is a freelance writer, editor, and columnist. Born in New York, she has written for the Examiner, Yahoo News, and Buzzfeed, among others.
