I respectfully have to disagree. Many thousands of studies are not applicable, use incorrect methods, draw conclusions the data doesn't support, rely on equipment that lacks the sensitivity certain measurements require, or suffer from a host of other flaws that can easily discredit a study, or at least its applicability to a given question.
I understand where you're coming from, but correlation does not prove causation. I know you never stated causation, but it can be implied from the tone of the article. One flaw doesn't necessarily strip a study of its merits, but in nutrition, as I previously mentioned, questionnaires are NOTORIOUSLY inaccurate for capturing caloric/macro intake. That's not a minor fault in a study like this; it could be potentially catastrophic to its findings and conclusions.
I hope you do win the lottery tomorrow and decide to share.

But that's a red herring as far as I'm concerned.
And if they can't count their macros, how does a simple questionnaire accurately gauge diet and lifestyle factors? Plus, this is being posted on a BB/AAS forum where most people do know how to track macros (they may not all do it, but most know how).
Also, why would you not want to look at a study's faults to better assess its relevance? If I posted a study saying Santa Claus is real, would you not look for faults in it, or would you trust it at face value? Way too many studies are not wholly applicable to what they're made out to be, and way too many people incorrectly interpret conclusions from data sets that don't support them.