The Marines

Movie Information

For longer than the United States has been an independent nation, there has been a Marine Corps. Marines consider themselves the very best America has to offer. Known for fierce patriotism, extraordinary courage, and innovative weaponry, they are a formidable force. This documentary focuses on their training and examines what it means to be a Marine.