Once upon a time, the only people who rocked tattoos in America were soldiers. Today, getting inked is a major part of our social landscape. Tattoos are now seen on everyone from supermodels to CEOs and have become largely commonplace. Even though the stigma isn't what it was years ago, some people still feel strongly about not putting anything permanent on their bodies. What's your view on tattoos? Is your body a blank canvas or a temple that should be left untouched? Are you into tattoos, or do you prefer the au naturel look?