So I must admit that I stopped eating meat after watching the documentary Food, Inc. If you haven't seen it, I highly recommend it; it highlights a lot of things that are wrong with our (the American) food industry. However, like all documentaries, Food, Inc. is undoubtedly biased. I was wondering if there are any other food documentaries you all know of that might shed some light on the other side of the situation. When people ask me why I don't eat meat, I often launch into tirades about our tyrannical food industry and the lies our government feeds us right alongside that hormone-pumped, ammonia-washed burger, but I hate nothing more than being misinformed, hence my inquiry.