If two people, based on the same data, say two opposite and mutually exclusive things, then at least one of them is lying or not in possession of full mental faculties, not “cherry picking”.
I beg to differ. Cherry-picking your data and criteria definitely CAN lead to opposite messages. To use an elaboration of the example in the graphic: say that you have a population of 1 million people who are eligible to work, and unemployment last year was at 10%. A million NEW people started looking for work this year (say, immigrants and people who turned 18), and 500,000 of them have found work.
Person 1: “Employment is up. More people have jobs now than last year”
Person 2: “Unemployment is up, as a percentage of the total workforce”
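To make the arithmetic behind those two statements concrete, here is a minimal back-of-the-envelope sketch in Python (assuming, which the comment doesn’t spell out, that everyone employed last year kept their job):

```python
# Quick check of the numbers in the example above.
# Assumption (not stated in the comment): everyone employed last year
# kept their job this year.

last_year_workforce = 1_000_000
last_year_employed = 900_000                                     # 10% unemployment -> 90% employed
last_year_rate = 1 - last_year_employed / last_year_workforce    # 0.10

new_entrants = 1_000_000                                          # immigrants and people who turned 18
new_entrants_employed = 500_000                                   # the half of them who found work

this_year_workforce = last_year_workforce + new_entrants          # 2,000,000
this_year_employed = last_year_employed + new_entrants_employed   # 1,400,000
this_year_rate = 1 - this_year_employed / this_year_workforce     # 0.30

print(f"Employed:          {last_year_employed:,} -> {this_year_employed:,}")  # 900,000 -> 1,400,000
print(f"Unemployment rate: {last_year_rate:.0%} -> {this_year_rate:.0%}")      # 10% -> 30%
```

Both speakers can be reporting true facts from the same data: half a million more people have jobs, yet the unemployment rate has tripled from 10% to 30% – which is exactly the cherry-picking point.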
Yes, but without your elaboration, one of those “examples” is what I wrote earlier.
“Lying by omission” is the correct term/definition for that kind of lying.
That’s why you’re sworn to tell “[…] the whole truth […]”.
Also, what some “people” in those pictures are trying to accomplish is to “be right”, to “win the argument” by any means necessary. We already have a definition for that too – eristic.
So yet again, there’s no need to “invent” new definitions that supposedly are there to define, ergo clarify, “pinpoint” things, but really just split hairs by a fuckton, cloud the issue, and create new divisions.
Oath swearing? The whole truth? In a thread about common data fallacies that can be encountered while the story’s being put together, surely you wouldn’t be swearing oaths at the scene of the crime?
You used to be able to hire oath swearers, back in ye Olde England.
“Despite the early addition of witness oaths to the English common law tradition, witnesses faced no codified penalties for perjury until the mid-16th century. Prior to that, it was believed that the specter of God’s vengeance alone was enough to coax witnesses into telling the unvarnished truth.”
More here
Back in the day, the transfer of feudal tenure (and other things) had to have a material element, as in physically standing on the land and handing over a clod of earth in front of witnesses – fascinating. It’s one of those things I’ll take any excuse to share. It’s called Livery of seisin, if you’re interested.
All said lightly: data fallacies happen, it’s human.
Yes, “oath swearing” and “whole truth” as in the ideas and concepts behind them, not the literal definitions.
Scene of the crime what?
You used to be able to buy slaves (as in “things”, not “beings”) in the old USofA. And?
I purposely didn’t mention god/God, and yet here we are with a Godwin’s law “equivalent”. Funny how that “happens”.
You also lost me in that “back in the day, Livery of seisin” bit.
I didn’t say that data fallacies don’t happen.
Yes, the concept behind getting people to swear to tell the truth serves the task of reaching a verdict. How much of the whole truth comes into the equation there is a big question. But yes, I see you mean the idea that it’s best for the people deciding what to believe to be fully informed.
By scene of the crime, I meant that many of the logical fallacies happen at the research stage, not when presenting findings to the court, as it were. So: scene of the crime. But then we agree that the court analogy is not what you’re employing here. Interestingly, witnesses are notoriously unreliable and could be said to be at the mercy of data fallacies.
God? Oh, right – nope, I didn’t bring him up; what’s religious morality got to do with scientific data fallacies? That’s the law’s domain.
The livery of seisin was a side note – not relevant to you, not interesting, no worries.
I really just liked the idea of the oath swearing and data fallacies, and made a rambly comment.
“The scientist is not a person who gives the right answers, he is one who asks the right questions.” – Claude Lévi-Strauss
Yes, human perception isn’t reliable in the best of circumstances – for starters, we’re biased because of our sex, upbringing/family, educational system, the things we believe and follow, etc., etc. A crime scene, with gruesome sights and adrenaline pumping, doesn’t help in getting a “clear picture”.
And that’s just bias. If you add active manipulation, you can end up with relativism tuned up to 9001…
As for the rest, we seem to also agree 🙂