Do you already consider yourself complete as an individual, or, in the U.S., do you seek completion by extending yourself into society and the workforce? On one hand, work does provide a sense of security and helps us feel more confident in what we do. Still, I personally feel that we should already be complete within ourselves before we start working, and that we should remain complete, without work defining who we are. This conversation with my father made me realize that in a consumerist society, one is prey to the role one plays within the system, a role not easily avoided. Doesn't everyone deserve fundamental things: food, housing, opportunity? If we are born with the God-given right to be free, shouldn't our government uphold that principle and not let people go poor or hungry in America?