Everything Hertz

Information:

Synopsis

A podcast by scientists, for scientists. Methodology, scientific life, and bad language. Co-hosted by Dr. Dan Quintana (University of Oslo) and Dr. James Heathers (Northeastern University).

Episodes

  • 40: Meta-research (with Michèle Nuijten)

    24/03/2017 Duration: 49min

    Dan and James are joined by Michèle Nuijten (Tilburg University) to discuss 'statcheck', an algorithm that automatically scans papers for statistical tests, recomputes p-values, and flags inconsistencies (a rough sketch of the idea follows below). They also cover:
    - How Michèle dealt with statcheck criticisms
    - Psychological Science's pilot of statcheck for journal submissions
    - Detecting data fraud
    - When should a journal issue a correction?
    - Future plans for statcheck
    - The one thing Michèle thinks that everyone else thinks is crazy
    - Michèle's most worthwhile career investment
    - The one paper that Michèle thinks everyone should read
    Links:
    - Michèle's website: https://mbnuijten.com
    - Michèle's Twitter account: https://twitter.com/michelenuijten
    - Statcheck: https://statcheck.io
    - Tilburg University meta-research center: http://metaresearch.nl
    - Guardian story on detecting science fraud: https://www.theguardian.com/science/2017/feb/01/high-tech-war-on-science
    - The paper Michèle thinks everyone should read: http://opim.wharton.upenn.edu/DPlab/papers/publishedP
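
    The statcheck description above boils down to recomputing a p-value from a reported test statistic and its degrees of freedom, then comparing it with the p-value printed in the paper. Below is a minimal, hypothetical Python sketch of that idea; statcheck itself is an R package, so the function name, tolerance logic, and example values here are illustrative, not the actual implementation.

    ```python
    # Illustrative only: recompute p from a reported t-test and flag mismatches.
    from scipy import stats

    def check_t_test(t_value, df, reported_p, decimals=2, two_tailed=True):
        """Recompute the p-value for a reported t-test and compare it,
        at the reported precision, with the p-value given in the paper."""
        recomputed_p = stats.t.sf(abs(t_value), df)
        if two_tailed:
            recomputed_p *= 2
        consistent = round(recomputed_p, decimals) == round(reported_p, decimals)
        return recomputed_p, consistent

    # "t(28) = 2.20, p = .04" recomputes to p ~= .036, which rounds to .04: consistent.
    # A reported "p = .01" with the same statistic would be flagged as inconsistent.
    print(check_t_test(t_value=2.20, df=28, reported_p=0.04))
    print(check_t_test(t_value=2.20, df=28, reported_p=0.01))
    ```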

  • 39: Academic hipsters

    10/03/2017 Duration: 54min

    We all know hipsters. You know, like the guy who rides his penny-farthing to the local cafe to write his memoirs on a typewriter, just because it's more 'authentic'. In this episode, James and Dan discuss academic hipsters: people who insist you need to use specific tools in your science, like R, Python, and LaTeX. So should you start using these trendy tools despite the steep learning curve? Other stuff they cover:
    - Why James finally jumped onto Twitter
    - A new segment: 2-minutes hate
    - The senior academic who blamed an uncredited co-author for data anomalies
    - An infographic ranking science journalism quality that's mostly wrong
    - When to learn new tools, and when to stick with what you know
    - Authorea as a good example of a compromise between "easy" and "reproducible"
    Links:
    - The science journalism infographic: http://www.nature.com/news/science-journalism-can-be-evidence-based-compelling-and-wrong-1.21591
    - Facebook page: www.facebook.com/everythinghertzpodcast/
    - Twitter account: www.twitter.com/hertzpodcast

  • 38: Work/life balance - Part 2

    24/02/2017 Duration: 01h02min

    Dan and James continue their discussion on work/life balance in academia. They also suggest ways to get your work done in a sane number of hours, as well as how to pick the right lab. Some of the topics covered:
    - Feedback from our last episode
    - Why the podcast started in the first place
    - The "Red Queen" problem
    - Does the "70 hour lab" produce better work?
    - Some experiments aren't suited to a 9-5 schedule
    - More tips for anonymously skiving off at work
    - What are the cognitive limits of focused work?
    - Do early career researchers even earn the minimum wage when you factor in the hours worked?
    - How James gets things done: work on one thing at a time until it's done, and protect your time
    - How Dan gets things done: Pomodoros (40 mins work, 10 minute break) and blocking social/news websites
    - How to pick a lab to work in
    Links:
    - Facebook page: https://www.facebook.com/everythinghertzpodcast/
    - Twitter account: https://www.twitter.com/hertzpodcast

  • 37: Work/life balance in academia

    17/02/2017 Duration: 56min

    In this episode, we talk work/life balance for early career researchers. Do you need to work a 70-hour week to be a successful scientist, or can you actually have a life outside the lab? Some of the topics covered:
    - An update on "the postdoc that didn't say no" story
    - Brian Wansink's response
    - De-identifying data in research
    - The perils of public criticism
    - Criticising the research vs. criticising the person
    - Some sage advice from a senior academic on "making science the centre of your life"
    - Look for a boss who won't make insane demands of your time
    - How much good work is really coming out of a 70-hour week?
    - An old hack Dan used to pretend he was working on data when he was really just on Twitter
    Links:
    - GRIM test calculator: http://www.prepubmed.org/grim_test/
    - Jordan's follow-up post: https://medium.com/@OmnesRes/the-donald-trump-of-food-research-49e2bc7daa41#.me8e97z51
    - Brian Wansink's response: http://www.brianwansink.com/phd-advice/statistical-heartburn-and-long-term-lessons
    - The "Making science the centre of yo

  • 36: Statistical inconsistencies in published research

    27/01/2017 Duration: 50min

    In episode 34 we covered a blog post that highlighted questionable analytical approaches in psychology. That post mentioned four studies that resulted from this approach, which a team of researchers took a closer look at. Dan and James discuss the statistical inconsistencies that the authors reported in a recent preprint. Some of the topics covered:
    - Trump (of course)
    - A summary of the preprint
    - The GRIM test to detect inconsistencies (a rough sketch of the test follows below)
    - The researchers who accidentally administered the equivalent of 300 cups of coffee to study participants
    - How do we prevent inconsistent reporting?
    - The 21-word solution for research transparency
    - Journals mandating statistical inconsistency checks, such as 'statcheck'
    Links:
    - The preprint: https://peerj.com/preprints/2748/
    - 'The grad student who never said no' blog post: http://www.brianwansink.com/phd-advice/the-grad-student-who-never-said-no
    - The caffeine study: http://www.bbc.com/news/uk-england-tyne-38744307
    - Tobacco and Alcohol Research Group lab handbook (see page 6 for open science pr
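
    As a side note on the GRIM test mentioned above, the core idea is simply that a mean of integer data from n participants must equal some whole-number total divided by n. The Python sketch below is a minimal illustration of that check, not the published implementation; the function name and example values are made up.

    ```python
    # Illustrative GRIM-style check: can this reported mean arise from n integer scores?

    def grim_consistent(reported_mean: float, n: int, decimals: int = 2) -> bool:
        """True if a mean reported to `decimals` places is achievable with n integer scores."""
        nearest_total = round(reported_mean * n)   # closest whole-number sum of scores
        achievable_mean = nearest_total / n        # the mean that total would actually give
        return round(achievable_mean, decimals) == round(reported_mean, decimals)

    # With n = 25, two-decimal means can only land on multiples of 0.04, so a
    # reported mean of 3.49 is impossible while 3.48 (= 87/25) is achievable.
    print(grim_consistent(3.49, 25))  # False
    print(grim_consistent(3.48, 25))  # True
    ```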

  • 35: A manifesto for reproducible science

    20/01/2017 Duration: 50min

    Dan and James discuss a new paper in the inaugural issue of Nature Human Behaviour, "A manifesto for reproducible science". Some of the topics covered:
    - What's a manifesto for reproducibility doing in a Nature group journal?
    - Registered reports
    - The importance of incentives to actually make change happen
    - What people should report vs. what they actually report
    - A common pitfall of published meta-analyses
    - The reliance on metrics in hiring decisions, and the impact of open science practices
    - Tone police
    - How do we transition to open science practices?
    - SSRN, the preprint server, being bought by Elsevier
    - Authors getting gouged by copyediting costs (and solutions)
    - Does being 'double-blind' extend to doing your analysis blind?
    - Trial monitoring is expensive
    Links:
    - The paper: http://www.nature.com/articles/s41562-016-0021
    - Our paper on reporting standards in heart rate variability: http://www.nature.com/tp/journal/v6/n5/full/tp201673a.html
    - Equator guidelines: http://www.equator-network.org
    - Facebook page: https://www.facebook.com/everythi

  • 34: E-health (with Robin Kok)

    22/12/2016 Duration: 01h11s

    Dan and James have their very first guest! For this episode they're joined by Robin Kok (University of Southern Denmark) to talk e-health. They also cover a recent blog post that inadvertently highlighted questionable research practices in psychology. Some of the topics covered:
    - The grad student who never said no
    - Postdoc work/life balance
    - Questionable research practices
    - Torturing data (with rattan sticks)
    - Using the GRIM test to assess data accuracy
    - Unpaid internships
    - Saying 'yes' to opportunities that come your way
    - The Myers-Briggs test is rubbish
    - What is e-health?
    - Are e-health interventions efficacious?
    - E-health intervention implementation issues
    - The poor quality of psych intervention smartphone apps
    - Using "Facebook Live" to broadcast conference presentations
    - The future of e-health
    Links:
    - Robin's Twitter account: https://www.twitter.com/robinnkok
    - "The grad student who never said no" blog post: http://www.brianwansink.com/phd-types-only/the-grad-student-who-never-said-no
    - The Buzzfeed quiz on 'Which Disney prince

  • 33: Zombie theories

    16/12/2016 Duration: 43min

    Dan and James discuss zombie theories, which are scientific ideas that continue to live on in the absence of evidence. Why do these ideas persist, and how do we kill them for good? Some of the topics covered:
    - Why do some ideas live on?
    - Zombie theories in heart rate variability research
    - Reasons why zombie theories proliferate more in the social sciences
    - Attractiveness and simplicity
    - Theories become brands
    - Oxytocin zombie theories
    - The power of shaming
    - Ideas are corrected more quickly in smaller fields
    - James' new interest in cow ECG
    - People using science as a weapon to open up hip pockets
    - How do we kill these zombies for good?
    - Manual vs. automated PubMed comments
    - What's the impact of paper retraction on future citations?
    - How do you correct the scientific record?
    Links:
    - Facebook page: https://www.facebook.com/everythinghertzpodcast/
    - Twitter account: https://www.twitter.com/hertzpodcast

  • 32: Can worrying about getting sick make you sicker?

    01/12/2016 Duration: 43min

    Dan and James discuss a new population study that linked health anxiety data with future heart disease. Some of the topics covered:
    - WebMD and health anxiety
    - How would health anxiety contribute to heart disease?
    - A summary of the study
    - Ischemic heart disease = coronary artery disease
    - Do people with health anxiety take better care of their health?
    - Don't be fooled by the percentage increase of risk for something that's rare (a worked example follows below)
    - There are some things you just can't randomize
    - The pros and cons of big data collection
    Links:
    - The paper: http://bmjopen.bmj.com/content/6/11/e012914.full
    - Facebook page: https://www.facebook.com/everythinghertzpodcast/
    - Twitter account: https://www.twitter.com/hertzpodcast
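
    Here is the worked example promised above, with made-up numbers rather than figures from the paper: when an outcome is rare, a scary-sounding relative increase in risk can correspond to a tiny absolute increase.

    ```python
    # Hypothetical numbers for illustration only.
    baseline_risk = 0.001      # assume 0.1% of unexposed people develop the outcome
    relative_increase = 0.70   # a headline-style "70% higher risk"

    exposed_risk = baseline_risk * (1 + relative_increase)
    absolute_increase = exposed_risk - baseline_risk

    print(f"Baseline risk:      {baseline_risk:.3%}")      # 0.100%
    print(f"Risk if 'exposed':  {exposed_risk:.3%}")       # 0.170%
    print(f"Absolute increase:  {absolute_increase:.3%}")  # 0.070%
    # Roughly one extra case for every ~1,400 exposed people, despite the "70%" headline.
    print(f"People exposed per extra case: {1 / absolute_increase:.0f}")
    ```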

  • 31: Discover your psychiatric risk with this one weird trick

    16/11/2016 Duration: 54min

    Dan and James discuss a recent study of over one million Swedish men, which found that a higher resting heart rate in late adolescence was associated with an increased risk of subsequent psychiatric illness. Some of the topics covered:
    - How did these authors get such an enormous dataset?
    - The benefits of testing so many people
    - What we liked about the study (hint: lots of things)
    - Measuring cardiovascular efficiency using a cycle ergometer
    - The pitfalls of self-reported physical activity
    - How the media covered this study
    - Contextual factors: does the testing environment induce anxiety?
    - Co-morbidity in psychiatry
    - What would James do with 200,000 ECG strips?
    Links:
    - The paper: http://jamanetwork.com/journals/jamapsychiatry/fullarticle/2569454
    - The Daily Mail story: http://www.dailymail.co.uk/health/article-3875062/Why-heartbeat-teenager-affect-later-life-Boys-high-blood-pressure-risk-mental-health-problems-adults.html?linkId=30382089
    - Facebook page: https://www.facebook.com/everythinghertzpodcast/
    - Twitter account: https://www.

  • 30: Authorship

    02/11/2016 Duration: 49min

    Dan and James discuss authorship in the biomedical sciences.

  • 29: Learning new skills

    16/10/2016 Duration: 48min

    Dan and James talk about how they learn new things. Some of the topics discussed:
    - Internet memes
    - Consolidating old ideas rather than learning new ones
    - Why learn a new skill when you can just get someone else to do it?
    - A lesson from not having a good understanding of statistical software...
    - James and Dan butt heads about meta-analysis (again)
    - Learning new things is interesting
    - How did people learn things before the internet?
    - How to follow things on Twitter without being on Twitter
    Links:
    - Bayes factor paper with 'primer' paper matrix: https://alexanderetz.com/2016/02/07/understanding-bayes-how-to-become-a-bayesian-in-eight-easy-steps/
    - Facebook page: https://www.facebook.com/everythinghertzpodcast/
    - Twitter account: https://www.twitter.com/hertzpodcast

  • 28: Positive developments in biomedical science

    30/09/2016 Duration: 49min

    Pre-registration, p-hacking, power, protocols. All these concepts are pretty mainstream in 2016, but they were hardly discussed five years ago. In this episode, James and Dan talk about these ideas and other developments in biomedical science. Some of the topics discussed:
    - James loves: blinded reviews, Sci-Hub
    - Dan loves: protocols, learning stats through social media, reproducible science
    Links:
    - The COMPARE initiative: http://compare-trials.org
    - "Give me the F-ing PDF" Chrome extension: https://chrome.google.com/webstore/detail/give-me-the-f-ing-pdf/iekjpaipocoglamgpjoehfdemffdmami/related
    - Facebook page: https://www.facebook.com/everythinghertzpodcast/
    - Twitter account: https://www.twitter.com/hertzpodcast

  • 27: Complaints and grievances

    23/09/2016 Duration: 52min

    Dan and James discuss complaints and grievances. Stay tuned for part 2, where things get positive. Some of the topics discussed:
    - Conflicts of interest
    - Criticism in psychology
    - Why does there seem to be so much bad blood in psychology?
    - Retracted papers: fraud or sloppiness?
    - Authors not acknowledging your peer-review remarks
    - The short-term nature of research
    - The benefits of 'centers of excellence'
    Links:
    - The 'vibe of the thing' scene from 'The Castle' (great Aussie film): https://www.youtube.com/watch?v=wJuXIq7OazQ
    - Facebook page: https://www.facebook.com/everythinghertzpodcast/
    - Twitter account: https://www.twitter.com

  • 26: Interpreting effect sizes

    09/09/2016 Duration: 45min

    When interpreting the magnitude of group differences using effect sizes, researchers often rely on Cohen's guidelines for small, medium, and large effects. However, Cohen originally proposed these guidelines as a fallback for when the distribution of effect sizes is unknown. Despite the hundreds of available studies comparing heart rate variability (HRV), Cohen's guidelines are still used for interpretation. In this episode, Dan discusses his recent preprint describing an effect size distribution analysis of HRV studies (a sketch of computing and interpreting an effect size follows below). Some of the topics discussed:
    - A summary of Dan's preprint
    - What is an effect size?
    - What can an effect size distribution tell us?
    - How effect sizes can inform study planning
    - How close are Cohen's guidelines to the distribution of HRV effect sizes?
    - What sample sizes are appropriate?
    - Pre-publication review vs. post-publication review
    - Statcheck
    Links:
    - The preprint article: http://www.biorxiv.org/content/early/2016/08/31/072660
    - Statcheck: https://mbnuijten.com/statcheck/
    - Facebook page: https://www.fa
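
    The sketch promised above: computing Cohen's d for a two-group comparison and reading it against Cohen's conventional 0.2 (small), 0.5 (medium), and 0.8 (large) benchmarks. The data are made-up illustrative values, not results from the preprint.

    ```python
    import numpy as np

    def cohens_d(group_a, group_b):
        """Standardized mean difference between two groups, using the pooled SD."""
        a, b = np.asarray(group_a, dtype=float), np.asarray(group_b, dtype=float)
        n_a, n_b = len(a), len(b)
        pooled_var = ((n_a - 1) * a.var(ddof=1) + (n_b - 1) * b.var(ddof=1)) / (n_a + n_b - 2)
        return (a.mean() - b.mean()) / np.sqrt(pooled_var)

    # Made-up values standing in for an HRV measure in two groups (arbitrary units).
    patients = [42, 55, 38, 47, 51, 44, 40, 49]
    controls = [58, 62, 50, 66, 54, 61, 57, 60]

    d = cohens_d(patients, controls)
    # Cohen's conventional benchmarks: |d| ~ 0.2 small, 0.5 medium, 0.8 large.
    print(f"Cohen's d = {d:.2f}")
    ```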

  • 25: Misunderstanding p-values

    27/08/2016 Duration: 55min

    P-values are ubiquitous, but do we really know what they mean? In this episode, Dan and James discuss a recent paper describing the failure to correctly interpret p-values in a sample of academic psychologists. Some of the topics discussed:
    - Common p-value misconceptions
    - James tests Dan on his p-value knowledge
    - P-values vs. effect sizes
    - The problem of sample size in p-value interpretation (a small simulation follows below)
    - The Facebook mood manipulation study
    - Data peeking
    - Equivalent p-values do not represent equivalent results
    - Meta-analytical thinking
    - Using significance as a categorical factor
    - Statistical vs. clinical significance
    - Clinical trial registration and 'secondary outcome creep'
    - Dan and James answer listener questions
    - Science communicator vs. scientist
    - Grant titles and the 'pub test'
    - NASA and social media
    Links:
    - The article: http://journal.frontiersin.org/article/10.3389/fpsyg.2016.01247/full
    - Geoff Cumming's book (we got the name completely wrong - sorry Geoff!): http://www.amazon.com/Understanding-The-New-Statistics-Meta-Analysis-eb
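
    The small simulation promised above: the same underlying effect (an arbitrary Cohen's d of 0.2) run at three sample sizes. The p-value typically swings from unremarkable to tiny as n grows, which is one reason equal p-values from different studies are not equal evidence about effect size. Everything in the snippet (groups, seed, effect size) is invented for illustration.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(42)
    true_d = 0.2  # same modest standardized difference in every run

    for n_per_group in (20, 200, 2000):
        control = rng.normal(loc=0.0, scale=1.0, size=n_per_group)
        treatment = rng.normal(loc=true_d, scale=1.0, size=n_per_group)
        t_stat, p_value = stats.ttest_ind(treatment, control)
        print(f"n = {n_per_group:4d} per group: t = {t_stat:6.2f}, p = {p_value:.4f}")
    ```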

  • 24: Incentive structures in science

    17/08/2016 Duration: 01h20s

    Science funding has a series of built-in incentive structures, but what sort of science does this produce? Some of the topics discussed:
    - Feedback from our 'Public health and Pokemon' episode (#22)
    - Incentive structures in science
    - What we should be doing in science compared to what we are doing
    - Quantity vs. quality
    - The analysis of Trump's tweets for negativity vs. positivity
    - Pre-registration
    - How much detail do you need to go into when pre-registering an analysis?
    - APS journal badges - they're working!
    - Data sharing makes you more careful with your data
    - Solutions to the incentive problem have to come from the policy level
    - The grant funding lottery system proposal
    - The PhD oversupply
    - Gaming the system
    - James wants to mandate science communication
    - Dan wants to include replication studies in PhD programs
    - Scientist names that suit their research area
    Links:
    - The article on incentive structures: https://medium.com/the-spike/how-a-happy-moment-for-neuroscience-is-a-sad-moment-for-science-c4ba00336e9c#.x3sea13i1

  • 23: Serious academics

    11/08/2016 Duration: 52min

    Can you be a "serious academic" while still posting photos on Instagram? In this episode, James and Dan discuss a recent article bemoaning the infiltration of the "selfie epidemic" into academia. Some of the topics discussed:
    - James and Dan share their thoughts on the article
    - The REAL 'c' word....
    - Social media and conferences
    - Snapchat + academics = snapademics
    - Using Instagram stories to share your research
    - Why "PHD Comics" is so successful
    - Criticism in academia
    - Listener question 1: What's your favourite part of research?
    - Listener question 2: What's your favourite technique or experiment to perform?
    - Listener question 3: What's a funny story from being an academic?
    Links:
    - The article: https://www.theguardian.com/higher-education-network/2016/aug/05/im-a-serious-academic-not-a-professional-instagrammer
    - A response to the article: https://www.theguardian.com/science/brain-flapping/2016/aug/05/im-a-non-serious-academic-i-make-no-apologies-for-this
    - Snapademics: https://twitter.com/snapademia
    - Facebook page: https://www.face

  • 22: Pokemon and public health

    03/08/2016 Duration: 59min

    Pokemon Go is sweeping the world and getting people walking again! But is the Pokemon Go 'model' a golden opportunity to tackle obesity, or just another fad? Some of the topics discussed:
    - James plays "Pokemon or cholesterol medication?"
    - Dan tries to explain Pokemon Go to James
    - James' first contact with Pokemon Go "trainers"
    - Should health interventions be modeled on Pokemon Go?
    - Other augmented reality exercise and health apps
    - What's the app's endgame?
    - Can health authorities copy this model?
    - We make a correction from episode 17: PLoS is in fact a non-profit publisher, not a for-profit one
    - Dan and James answer two listener questions: (i) the dumbest things they've ever done in the lab (both related to email faux pas), and (ii) how often lab meetings should be run
    - The importance of PROPERLY piloting your experiment
    - If you don't know the person in the meeting who takes up too much time, it's probably you
    Links:
    - The quiz: http://www.slate.com/articles/technology/gaming/2016/07/pokemonorcholesterolmedicationa_quiz.html

  • 21: This is your brain on steroids

    22/07/2016 Duration: 58min

    It's well established that steroid use is associated with many adverse health outcomes, but what does it actually do to your brain? Dan and James discuss an interesting new paper that compared brain structure in long-term steroid users and non-using weightlifters. Some of the topics discussed:
    - A summary of the study
    - How are steroids typically used?
    - What are the differences in use between sports?
    - The recruitment of 'real' users
    - James gives Dan a surprise Norwegian test (he doesn't do too well)
    - The things Dan and James liked about the study (hint: many things)
    - Steroid use in women
    - Dose-dependent effects of steroids
    - Folk beliefs surrounding steroid use
    - James' goal of making his cat as jacked as possible
    - If you have a great study, there's no need to oversell it
    - James' experience of participating in a growth hormone trial
    Links:
    - The paper: http://www.sciencedirect.com/science/article/pii/S000632231632529X
    - Facebook page: https://www.facebook.com/everythinghertzpodcast/
    - Twitter account: https://www.twitter.com/hertzpod
