Who Brought Christianity To America?
Spanish missions. Catholicism first came to the territories now forming the United States just before the Protestant Reformation (1517), arriving with Spanish conquistadors and settlers in present-day Florida (1513) and the Southwest.

How was Christianity introduced to America?
Christianity was introduced to the Americas as they were first colonized by Europeans beginning in the 16th …