the American dream
noun /ði əˌmerɪkən ˈdriːm/
 [singular] the belief that America offers everyone the opportunity of a good and successful life achieved through hard work
   Born a poor boy in Kansas, he lived the American dream as a successful inventor.