Is organic really better?




It’s only been about a year since I started consciously buying more organic products. Before that, I somehow just didn’t give it much thought; if anything, I felt it was a bit of a rip-off for something that was mainly labelled differently. So is organic really better, or just more expensive?

It appears that in terms of health benefits the jury is still out, although I think the more important question isn’t necessarily whether organic is healthier, but whether conventional is potentially harmful. I’m not a scientist and I certainly don’t know anything about pesticides, but to me it seems logical that the absence of them can’t be a bad thing (for both human and planetary health).

I often hear the argument that organic farmers can’t produce as much as conventional farmers, and that more land is therefore needed, which raises the question of sustainability. Given that approximately 77% of all agricultural land is used for livestock (both grazing land and land used for animal feed production), I can’t help but think we could easily feed the whole world an organic diet if meat wasn’t in the picture.

Of course the issue of affordability remains, and not everyone can eat organic all the time – if at all. Eating more fruit and vegetables is always a good idea, organic or not, and I know how expensive it is to feed a family, so I buy organic versions of the foods most affected by pesticides and worry a bit less about the rest. I find the ‘dirty dozen’ (most affected) and the ‘clean fifteen’ (least affected) a really helpful guide, and I try to buy things when they are in season, which often means they are cheaper too.







YOU ARE WHAT YOU EAT