When did Florida become part of the United States?

Florida, along with the Gulf Coast strip of land known as West Florida, was the next piece of Spanish America to be lost.

In 1819, under the Adams-Onís Treaty (which took effect in 1821), Spain ceded the Floridas to the United States, partly in return for American acknowledgment that Texas was part of New Spain.

That acknowledgment was short-lived.