I hate how many people treat marketing as essential to our society. Our days would go much more smoothly without marketing or advertising of any kind, so why do we have it? Do you think we need to increase or decrease the advertising around us to improve America on a social level?
