United States Navy

History
The United States Navy is the maritime service branch of the United States Armed Forces. Its sailors and officers operate primarily at sea, projecting naval power and defending the United States against foreign threats.