Truth or Truthiness: Distinguishing Fact from Fiction by Learning to Think Like a Data Scientist

Teacher tenure is a problem. Teacher tenure is a solution. Fracking is safe. Fracking causes earthquakes. Our kids are over-tested. Our kids are not tested enough. We read claims like these in the newspaper every day, often with no justification other than 'it feels right'. How can we figure out what is right? Escaping from the clutches of truthiness begins with one simple question: 'what is the evidence?' With his usual verve and flair, Howard Wainer shows how the sceptical mind-set of a data scientist can expose truthiness, nonsense, and outright deception. Using the tools of causal inference he evaluates the evidence, or lack thereof, supporting claims in many fields, with special emphasis on education. This wise book is a must-read for anyone who has ever wanted to challenge the pronouncements of authority figures, and it offers a lucid and captivating narrative that entertains and educates at the same time.


30 reviews for Truth or Truthiness: Distinguishing Fact from Fiction by Learning to Think Like a Data Scientist

  1. 4 out of 5

    Joseph Spuckler

    "There are three kinds of lies: lies, damned lies, and statistics." Attributed to Benjamin Disraeli Truth or Truthiness: Distinguishing Fact from Fiction by Learning to Think Like a Data Scientist by Howard Wainer is a study of the information and how it is used in modern society. Wainer is an American statistician, past Principal Research Scientist at the Educational Testing Service, adjunct Professor of Statistics at the Wharton School of the University of Pennsylvania. We are bombarded with st "There are three kinds of lies: lies, damned lies, and statistics." Attributed to Benjamin Disraeli Truth or Truthiness: Distinguishing Fact from Fiction by Learning to Think Like a Data Scientist by Howard Wainer is a study of the information and how it is used in modern society. Wainer is an American statistician, past Principal Research Scientist at the Educational Testing Service, adjunct Professor of Statistics at the Wharton School of the University of Pennsylvania. We are bombarded with statistics in our daily lives. One news station will show statistics that the economy is failing another that things are on the upswing. The death penalty is a crime deterrent, but Texas continues to execute more people than any other state (but ranks 11th if taken by executions per capita) and 16th in violent crime. Alaska ranks first in violent crime and has no death penalty. Vermont has the lowest level of violent crime and no death penalty. There is seems to be little in correlation in the death penalty and violent crime. Perhaps more information is needed. Wainer brings to the table a simpler example to the table that there is a strong correlation between the number of people eating ice cream and the number of people drowning. When ice cream eating spikes, so do the number of drownings. There must be a connection between the two. Actually, there isn’t. When the weather gets warmer more people take part in eating ice cream and swimming. The more people that swim the higher the number of drowning victims. One could take these figures and, wrongly, conclude that swimming and eating ice cream leads to higher temperatures perhaps a point for snowball throwing Senator Jim Inhofe Missing information and how it is treated is as important as the information present. A company questionnaire asks employees how happy they are are with their jobs. The company reported that 80% of the respondents were happy or very happy. What is missing from the equation is that only 22% of the employees were motivated enough to complete the questionnaire. Many times missing information is much more complicated. In long-term studies, not everyone continues the study. If the study was following smokers, for example, what is to be done with the subjects that quit smoking? Those who die from non-smoking related disease and accidents? Those who just don’t want to participate anymore? Wainer gives examples and ways to deal with missing information without skewing the results. Other problems are what about information that was not realized at the time. Cigarette smoking was a leading cause of preventable death in America and obesity was not that great of a concern. The problem was that smokers tended generally to be thinner than nonsmokers skewing the rate information on obesity. Thinner people died at a higher rate than the obese because of the number of smokers. Wainer takes on a variety of popular issues such as SAT tests, Teacher tenure, fracking, test cheating, standardized tests and a variety of other hot social issues. 
He starts slowly with simple examples separating truth from truthness and move to more complex problems. He even examine graphs and shows how results can be hidden by the type of graph being used. Truth or Truthiness is a study of understanding information and data and interpreting it in a useful manner. It means for us to question what we see and hear to check the data and who supplies the data and determine how truthful it really is or if it is simply serving another group’s needs by appealing to your emotions and “gut feelings.” A very good read in our age of quick information, unofficial polls, and truthness.
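
    A minimal sketch of the confounding pattern this reviewer describes (the ice cream/drowning example), with invented numbers and hypothetical variable names; it is an illustration, not the book's own analysis. Temperature drives both series, so their raw correlation is high, while the correlation of the residuals after adjusting for temperature is near zero.

    # Illustrative simulation of a spurious correlation driven by a confounder.
    # All data here are made up; "temperature" plays the hidden third variable.
    import numpy as np

    rng = np.random.default_rng(0)
    days = 365
    temperature = 15 + 10 * np.sin(np.linspace(0, 2 * np.pi, days)) + rng.normal(0, 2, days)

    # Both outcomes depend only on temperature plus independent noise.
    ice_cream_sales = 50 + 8 * temperature + rng.normal(0, 20, days)
    drownings = 0.5 + 0.1 * np.clip(temperature, 0, None) + rng.normal(0, 0.3, days)

    # The raw correlation looks strong...
    print("corr(ice cream, drownings):",
          np.corrcoef(ice_cream_sales, drownings)[0, 1])

    # ...but it largely vanishes once temperature is adjusted for, e.g. by
    # correlating the residuals of each series after a regression on temperature.
    def residuals(y, x):
        slope, intercept = np.polyfit(x, y, 1)
        return y - (slope * x + intercept)

    print("partial corr given temperature:",
          np.corrcoef(residuals(ice_cream_sales, temperature),
                      residuals(drownings, temperature))[0, 1])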

  2. 4 out of 5

    Stephen Rush

    This is not a book most will like. Without the correct priming, Wainer can be difficult to understand. I read this as a supplement in a doctoral program to better understand how to think through problems from the angle of a statistician and researcher. Snooze-fest sentence, I know, but don't let me lose you. We would regularly discuss Wainer and the studies he proposed. Even though he is brilliant, he packs in too much information for a casual read-through. Because of this, each chapter needs to be slowly read and then reread. I learned a ton from this and will reread it, as I am sure I missed a ton. The cover will make people think it's simple. This book is not for the casual reader or someone looking for a nightstand book.

  3. 5 out of 5

    Jacinta

    This book is terrible and the few entry-level good points it has are overwhelmed by the idiocy. I struggled almost halfway through it and then came across this image, their example of their own "clear representation in the change of tax sources over time". WTF. Did not finish. Do not recommend. https://books.google.com/books?id=OHM...

  4. 5 out of 5

    Mehmet

    Rubin's method for analyzing clinical-trial data that has dropouts (or deaths) before the trial is complete blew my mind.

  5. 4 out of 5

    Angie Reisetter

    Wainer's stated purpose in writing this book is to help his readers develop habits of mind that allow them to distinguish truth from things that feel right but actually aren't (truthiness). His annotated table of contents is very clear and well-organized, and he makes some very good points. He walks us through what kinds of experiments would be necessary to prove certain claims, talks about what we should then do if those experiments aren't possible (they often aren't), and concludes with some very interesting examples, including the controversial teacher tenure and testing cases that are mentioned in the book blurb. Wainer worked for the Educational Testing Service for years, and his examples involving testing are enlightening and worth pulling out for discussion in and of themselves. A few of his case studies in the later chapters are pretty well-researched and take into account history and context in guessing at why the statistics say what they do. Really good stuff. I'm glad I stuck it out and got to them, because I found the opening chapters of the book, in which Wainer is laying out how to think correctly, a bit off-putting.

    The book blurb talks about "his trademark verve and irreverence", but mostly I would call his overall voice snarky. And sometimes that's amusing. We especially like to let old men get away with it. If you're in the mood to hear faux nostalgia for the good old days before Nate Silver, when statisticians were uncool and he could be left in peace on an airline trip, this is the book for you. And he would love to show you his son's Princeton acceptance letter -- yep, printed right in there, with a statistic on how rare they are just in case you didn't know. He makes snide comments about cheaters, sure, but also about almost everyone else his stories run across. So you have to be in the mood to be amused by a curmudgeon. You also have to let him get away with being dismissive of any guessing or reasoning in case studies that goes in directions that clearly don't interest him much. I almost refrained from saying we'd never let a woman write like this, but there it sneaked in.

    But if you can get past that, this is really a good outline for a few tests to put statistics through before you believe them. That's a good tool to arm yourself with when venturing out onto the internet. There is also a good chapter on what to look for in a graphic and how data can hide in plain sight. He ends with a chapter titled "Don't try this at home", which must be some kind of humor I don't understand, because he very much encourages us to try this at home and shares stories of people (strangely focusing on one very unusual family) who do investigate statistics, with impressive results.

    All in all, I learned a lot, and that's a great thing, but I was looking for something that I could unequivocally recommend to my science-major students, and I'm not sure this is it. But if you're ready to sit back and hear what this character has to say and learn what you can from him, you won't regret it.

    I got a free copy of this from Net Galley.

  6. 5 out of 5

    Debbie

    "Truth or Truthiness" is about how to design better causal studies and better graphs. It's mainly targeted at people working in education who can influence policies about testing, tenure, and such. It's written in a very formal way and uses technical language. The author assumed the reader already knew what a "longitudinal study" and "cross-sectional study" are, for example, and that you understand words like "ancillary information," "covariates," and "legerdemain." Some words were defined, but "Truth or Truthiness" is about how to design better causal studies and better graphs. It's mainly targeted at people working in education who can influence policies about testing, tenure, and such. It's written in a very formal way and uses technical language. The author assumed the reader already knew what a "longitudinal study" and "cross-sectional study" are, for example, and that you understand words like "ancillary information," "covariates," and "legerdemain." Some words were defined, but often pages after the author first used them. While the author was inspired by real claims or studies, many of his Case Studies used made up data to illustrate his point. He explained how to set up a random-assignment controlled experiment, which is the gold standard when possible. He then explained ways to increase the accuracy of observational studies, like gathering additional information when randomization is impossible and how to interpret the results while including missing data (from people dropping out of the study, dying, etc.). He showed how to use extrapolation, ways to deal with unexpected events, and how to create effective graphs to clearly present the information discovered in a study. He also did some ranting about current education policies (removing tenure, detecting cheating, measuring school performance, changes in the SAT, and the accuracy of subscores). While the information about creating better studies seemed useful, I did not care for his mocking, dismissive tone. For example, he acknowledges that there may be a missing "third variable" in regards to fracking apparently causing increased earthquakes. However, since he can't think of one, there must not be one. People who have pointed out (in their own way) that it's not a certain cause-effect get mocked by the author. I happen to agree with him about fracking, but he mocks people for assuming things because "it makes sense to them." Yet when he does it, his conclusions are based solely on "logic and evidence" and everyone else is either stupid or corrupt. I received this book as a review copy from the publisher.

  7. 5 out of 5

    Daniel Christensen

    The book is a sort of mash-up of epistemology, statistics, and general musings. I thought early on it was going to be more tightly focussed on truth versus truthiness, then I thought it was going to be on the Don Rubin causal model, then I thought it was a manifesto for modern data science. It was a bit looser than this, but an enjoyable read nonetheless. The author has a particular interest in education, and this informs many (but not all) of the examples. If there is a theme tying this together, other than an epistemological bent to his statistics, it is the idea of always starting with an idealised experiment, then working back to reality, and a focus on clear thinking and communication. I even had that deceptive sense of understanding the 'counterfactual' while I was reading the book, although sadly that passed.

    This isn't bland. The author's voice comes through clearly. He has a great knack for quotes and turns of phrase that support his interests. "Dear God, make my enemies ridiculous" (Voltaire). "I will listen to any hypothesis, but on one condition – that you show me a method by which it can be tested." (von Hofman)

    It's another one that gets a worthy spot on the bookshelf, and it would do with a bit more note-taking. It's not so technical that the interested generalist would necessarily struggle through. BTW, truth in the book's case is belief backed up by evidence. Truthiness is when you believe something based purely on the feels. He has a bit of fun with this.

  8. 4 out of 5

    Aaron Dietz

    Parts of this book were great--loved the part on fracking and some of the pieces about education in general. Really clarified a few things with some nice visualizations and talks about data. I think where this didn't do well with me is that it was always flirting with getting more detailed but never quite going there, didn't quite hit with clarity and focus regarding truly thinking like a data scientist, and the writing style was a bit parse-heavy. Overall I think this is a decent quick look at several different topics, though, and would ultimately recommend it for its browsing potential (skim through it for visuals and subject matter that appeals to you).

  9. 4 out of 5

    Angelo

    The author puts across, in a very straightforward way, some of the main key points every data scientist needs to be aware of: how to disentangle correlation from causation, spotting third variables that can influence our conclusions, and so on. I really enjoyed reading it. I totally recommend it to anybody who wants to get their hands dirty with data.

  10. 5 out of 5

    Chris Callaway

    Tough sledding at times, and the author can't help but get unnecessary jabs in at conservative politicians, but there's some interesting and useful stuff in here.

  11. 5 out of 5

    Cambridge Press

  12. 5 out of 5

    Andy

  13. 4 out of 5

    Yunus Emre Bakiler

  14. 5 out of 5

    Daniel

  15. 5 out of 5

    Adam

  16. 4 out of 5

    Dan Ust

  17. 4 out of 5

    Hayden Eastwood

  18. 5 out of 5

    Robert Kosara

  19. 4 out of 5

    Meredith

  20. 5 out of 5

    Mateusz Lyska

  21. 5 out of 5

    Josh Ellis

  22. 4 out of 5

    Sara

  23. 4 out of 5

    Ricardo Gutierrez

  24. 4 out of 5

    Lasse

  25. 4 out of 5

    Ben Eggleston

  26. 5 out of 5

    Roy W. Latham

  27. 5 out of 5

    Emily

  28. 4 out of 5

    Jorg

  29. 4 out of 5

    Eric

  30. 4 out of 5

    Claude Marseille
