Since the dawn of film, the Western genre has shaped American society, and over the years it has become an iconic part of American culture. Considered by many to be one of America's few original art forms, the Western dramatizes the early days of the expansive American frontier, mostly set in the latter part of the 19th century.
Western movies portray dangerous adventures in the wilderness, beautiful landscapes, lonesome cowboys, gunfights, saloons, ranch houses, and conflicts between white settlers and Native Americans.
More @ The Vintage News