Tag: stats

  • Numbers, please

    I don’t normally go around asking for stats. I’ll take a good anecdote over a slippery statistic any day.

    And yet… I feel like an old operator at times saying, “Number, please.”

    Last night I was writing and had a very simple question: how many US prisoners are in solitary confinement? It seems like a simple and important question, since this is a free country and solitary confinement has been proven to drive people crazy.

    Get this… we don’t know. How can we not know? I don’t think you have to be a bleeding heart to think we should know how many people are locked up in solitary confinement. Isn’t not knowing a sign of the gulag?

    Then by chance there’s a story in USA Today about solitary. At least from the Illinois figure we can extrapolate to the rest of the nation. So I would guess between 40,000 and 80,000.

    Speaking of numbers, there’s this story in The Wall Street Journal about a 54-year-old librarian in Las Cruces, New Mexico, who “spends most mornings sifting reports in the Mexican press to create a tally of drug-cartel-related killings in Ciudad Juárez, Mexico.”

    Why? Because nobody else is keeping track. The paper points out, “There is no official count of the people killed in Mexico’s escalating drug wars—whether the victims are drug traffickers, police or civilians.”

    In Juarez, the tally this year already (it’s June) is over a thousand. “I don’t think there’s a phenomenon like that in the world unless it’s a declared war,” Ms. Molloy said. “Ten years from now, people are going to ask ‘What happened in Juárez?’”

    When I see fancy stats I’m always skeptical (especially when they’re based on data of questionable validity). But a basic count? A simple population figure? Solitary confinement? Murders? People… these are numbers we need!

    [Update: LEAP board member Walter McKay lives in Mexico and keeps track of the numbers. He posts on the LEAP Blog. He also maintains a Google map of the murders.]

  • A felony just ain’t what it used to be!

    Lost in all the talk about the NYPD juking the stats is the simple fact that each and every year, the value of felony theft (“grand larceny” in NY State) goes down with inflation.

    New York State defines felony grand larceny (§155.30) as over $1,000. And this is where it’s been for the past 25 years.

    This makes the 64% reduction in grand larceny over the past 20 years all the more impressive since inflation alone has stripped almost 40% of a felony’s value.

    By lowering the value of a felony, we’re cheapening its meaning and labeling more and more people as felons. And that is harmful and costly for all of us.

    $1,000 today is closer to the $275 a felony larceny was raised from in 1986 (and where it had been from 1965 to 1986). To keep the value of a felony consistent, it’s time to raise the dollar amount to $1,600 – $1,900. But since these figures stick with us for 20 or 30 years, why not jack it up to an even two grand?
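    The inflation arithmetic can be sketched in a few lines. This is a rough illustration only: the CPI values below are approximate annual-average CPI-U figures (check the BLS for exact numbers), and the `adjust` function is a hypothetical helper, not anything from the original post.

```python
# Rough sketch of inflation-adjusting a felony-theft threshold.
# CPI figures are approximate annual-average CPI-U values (illustrative only).
CPI = {1965: 31.5, 1986: 109.6, 2010: 218.1}

def adjust(amount, from_year, to_year):
    """Convert a dollar amount between years using the CPI ratio."""
    return amount * CPI[to_year] / CPI[from_year]

# The 1986 threshold of $1,000, expressed in 2010 dollars: roughly $2,000.
print(round(adjust(1000, 1986, 2010)))

# What $1,000 in 2010 buys in 1986 dollars: roughly $500.
print(round(adjust(1000, 2010, 1986)))
```

    On these assumed CPI values, a $1,000 threshold set in 1986 corresponds to about $2,000 today, which is roughly where the post’s “even two-grand” suggestion lands.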

    Last time we did this, serious felony crime in New York City decreased 11% overnight. And this despite rising crime!

    Here’s to a $2,000 felony! Let the movement start here.

    [Thanks to a police officer for raising this question and to a John Jay librarian who dug up this hard-to-find information on a moment’s notice! Ain’t librarians grand?!]

    p.s. While we’re at it, maybe it’s time to adjust that “$20” figure in the Seventh Amendment, too.

  • Those Slippery Stats

    Somebody tried to do to me what I tried to do to the Heritage Foundation. I was accused of playing fast and loose with the numbers in my Washington Post op-ed.

    In the old days I could have just challenged him to a duel. I’d feel pretty confident going into that battle! Instead I have to defend my honor with a written reply to this:

    “In many ways, Dante Arthur was lucky. He lived. Nationwide, a police officer dies on duty nearly every other day.” [emphasis added]

    Let’s see – 365 days a year. That makes nearly 180 such deaths each year.

    I’ve been out of the crim biz for a while, but that number sounded high to me. So I went to the UCR. Sure enough, in 2007, 140 police officers died in the line of duty. As Moskos and Franklin say, nearly one every other day.

    But 83 of those officers died in accidents; only 57 were homicide victims – one every 6 days. Still a lot. But how many of those were drug-related? The UCR has the answer:

    One.

    Nor was 2007 unusual. In the decade ending 2007, 1300 police officers died on the job. About 550 of these were in felonies, not accidents. And of these, 27 were drug-related. Three a year is still too many, but it’s a far cry from one every other day.

    Maybe I should have looked at a DVD of The Wire instead of the UCR.

    Moskos and Franklin argue that federal laws should allow states to make the manufacture and distribution of drugs legal and regulated rather than criminal. The authors make several good arguments against current drug laws, which have created many problems that legalization might ameliorate. But I’m skeptical as to whether legalization would make much of a difference in police safety.

    You can read the whole post here.

    The Wire line is ironic since both Franklin and I actually policed the streets of The Wire.

    I replied with this:

    I take my numbers seriously and I criticize others for exactly what you’ve criticized me for. So I feel I need to defend myself thoroughly. You’re not being fair to me.

    As is often the case, a little qualitative insight is needed to round out the quantitative data. The numbers aren’t showing the real picture. You have too much faith in the UCR numbers. For what it’s worth, I was in the unique position of actually putting data into the UCR for a year before analyzing the same data coming out the other end. It’s an odd position for a researcher (conflict of interest?), but I can actually identify some of the UCR homicides in 2000/2001 as “mine.”

    First the non-disputed part.

    The best source for info on officer deaths is The Officer Down Memorial Page. It’s much more detailed than the UCR (and probably more accurate, too). Over the past four years, deaths have averaged 162.5 per year. Not half of 365, but close enough to say “nearly one every other day.” But you grant me that.

    But Dante Arthur wasn’t killed. He’s not a UCR or officer-down stat. And of course we’re all happy for that. But his life-changing war-on-drugs injury (he got shot in the mouth) all but disappears from the public record after a few days in the Baltimore Sun. It would be great to have a database on prohibition violence, but we don’t have one.

    But the real issue you’re getting at is the circumstances of deaths and injuries. Fair enough.

    It’s a generally accepted figure in Baltimore that 80% of homicides are drug related. How do we come up with that? Well… yes, to some extent it’s just made up. But it’s made up by homicide detectives, based on experience and common sense. And it rings true. So grant me that 80% figure for Baltimore homicides, if you will.

    Go to the UCR homicide supplement for 2006 (you could pick any year, but I just happen to have that file handy). There are 270 homicides listed for Balto. There were actually 276 murders that year, but that’s another issue.

    Run a frequency table for “Offender 1: Circumstance.” Narcotic drug laws are listed as the circumstance in 3 murders, or 1.1 percent of all homicides. 1.1 percent?! That’s a big difference from 80%.

    At this point one of my favorite lines comes to mind: “What are you going to believe? Me, or your lying eyes?”

    I think it’s safe to assume that a similar under-representation exists for the drug-related circumstances of officers killed.

    If two drug dealers are fighting and one kills somebody, that’s not listed in the UCR as drug-related. It’s an “argument over money or property.” If a cop is killed in a car crash responding to the scene, it’s listed as a motor-vehicle death. If another drug dealer is found dead along the way with no witnesses, the death is listed as “circumstances undetermined.” But it’s all drug deaths. The UCR doesn’t tell the whole story.

    If the UCR listed officers injured, Dante Arthur’s injury would not be listed as drug related. It would be listed as “arrest” or “other arrest.” And I simply don’t believe the UCR data on officers assaulted. I think they’re worthless (but that’s not for this post).

    I like your pie chart, but you’re not looking at the meaning of the data correctly. Of those 103 “traffic stops,” how many are drug related? I don’t know. But I’d guess 80-90%. A man wanted on a drug warrant. Police trying to conduct a discretionary search of a car for drugs. Officers don’t get killed pulling over my mom.

    “Disturbances”? I’d guess about 1/3 with the rest being domestic violence (though probably 1/3 of those are drug-related as well). “Other” and “Other Arrest”? Probably half. “Ambush”? Maybe 25% (I keep thinking of those crazy white kooks killing people. Those are not drug related.)

    And I’d guess probably 10-15% of traffic deaths are drug-related. My friend Crystal Sheffield died in such an accident, trying to back up another officer involved in, yes, a drug-related dispute. But you won’t find that in the UCR.

    So put it all together and what do you have? A lot of prohibition and drug-related deaths. And there are multiple times more injured than killed in similar circumstances. We don’t put a number in the op-ed because we don’t have a number (maybe you and I could keep that database?).

    But from our experience and my participant-observation research, we both know (often personally) officers hurt and killed in the drug war. We both have a pretty good idea about how it fits into the total picture. So UCR data be damned!

    Writing an 800-word op-ed is different than writing an academic journal article. But I wasn’t playing and don’t play fast and loose with the numbers. It just so happens that the UCR numbers themselves play fast and loose with the facts.

    (and I do graciously accept apologies.)

  • Sentence Length [or lies from the Heritage Foundation]

    In a Heritage Foundation report by Charles Stimson and Andrew Grossman, I learned a very surprising fact:

    Convicted persons in the United States actually served less time in prison, on average, than the world average and the European average. Among the 35 countries surveyed on this question in 1998, the average time actually served in prison was 32.62 months. Europeans sentenced to prison served an average of 30.89 months. Those in the United States served an average of only 28 months.

    From “Adult Time for Adult Crimes: Life Without Parole for Juvenile Killers and Violent Teens”

    [Update/Correction: two hours later]

    I’m generally no fan of the conservative Heritage Foundation. In fact, just between you and me, I generally hate them and everything they stand for. But I wasn’t going to bring that up because I like to be tolerant and forgiving by nature. And if two of their researchers can write a good report, I’m more than happy to read it and learn.

    And though it’s rare to catch people in all-out balls-to-the-wall lies (though I’ve caught the DEA red-handed on the issue of drug prices), there’s nothing too rare about academic and moral dishonesty.

    I decided to do a little fact checking, since, well, I didn’t really believe that our prison sentences were shorter. Plus I don’t trust the Heritage Foundation.

    [The actual Heritage report, by the way, is about why we should keep sentencing juveniles to life without parole. It seems like a strange cause to fight for. What do they chant at rallies? But that’s neither here nor there. I’m interested in the time people spend behind bars.]

    First read the above quote from the Heritage Foundation and think about what it means.

    Stimson and Grossman are not two fresh-faced grad students to be treated with kid gloves for bad statistical analysis. One is a “Senior Legal Fellow” and the other a “Senior Legal Policy Analyst.” And besides, they’re trying to influence policy and get more kids locked up forever.

    Plus their report claims to be all about getting the “facts” right. And much of their report resonated with the cop in me. And 10 pages of endnotes certainly gives it the veneer of rigorous academic analysis.

    I copied the data (“Table 18.01: Average length of time actually served in prison”) to SPSS and crunched the numbers just like they did. Indeed, the average US sentence length is listed as 28 months and the mean length of time for the all countries listed is 32.62 months.

    But anybody who does basic stats–and if you can copy the data from a table into a stats program, crunch the numbers, and publish them, you had better know basic statistics–should see two red flags. First is the two-decimal result. The original data are rounded to the nearest month, so two decimal places imply a precision that simply isn’t there. Besides, who really cares about 1/100 of a month (just about 8 hours)?

    The second red flag is the use of mean and not median for “average.” The difference between the two matters. “Mean” is the average in the sense of adding up all the numbers and dividing by the total number of numbers. The “median” is the point at which half the numbers are above and half the numbers below. Both “mean” and “median” are averages, but “median” is generally better for analyses of numbers that have a set minimum (often zero) on one side but are open-ended on the other side (as in, they can go up to a gazillion!).

    Take income. Median income is always lower than mean income because the millionaires (the outliers at the high end) push the mean way up. If next year everybody in the U.S. made $1,000 less but Bill Gates, one person, made a trillion dollars more, the mean American income would go up by about $2,000 per person! But the median income would go down $1,000, just like almost everybody’s actual income.
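    The income example can be sketched in a few lines of Python. The incomes below are made-up, illustrative figures, not real data:

```python
from statistics import mean, median

# Ten made-up household incomes in thousands of dollars; one huge outlier.
incomes = [30, 35, 40, 45, 50, 55, 60, 70, 80, 1000]

print(mean(incomes))    # 146.5 -- pulled way up by the one outlier
print(median(incomes))  # 52.5  -- much closer to a "typical" income
```

    One outlier nearly triples the mean while barely budging the median, which is exactly why median is the better “average” for open-ended distributions like income or time served.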

    So if Stimson and Grossman had used the median, the average would go from 33 to 26 months and the U.S. would go from below average to above average. So if they’re using means, they’re either statistically ignorant or trying to pull a fast one. But no matter; I wasn’t going to spend time writing all of this over a difference of seven months.

    But wait… there’s more.

    2) Statistical outliers: Malcolm Gladwell didn’t invent them just to sell books. You generally shouldn’t include them in statistical analysis. The outliers here, in terms of sentence length, are Colombia, Qatar, Moldova, Latvia, and Suriname (together averaging 90 months). Remove these five countries and the mean goes down to 23.5 months and the median to 19 months.
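    A toy illustration of the same effect, with invented months-served figures (not the actual table data): a handful of high outliers inflates the mean far more than the median.

```python
from statistics import mean, median

# Invented months-served figures: ten typical countries plus five 90-month outliers.
months = [3, 4, 5, 5, 8, 14, 26, 28, 29, 33] + [90] * 5

print(round(mean(months), 1), median(months))    # with the outliers included
trimmed = [m for m in months if m < 90]
print(round(mean(trimmed), 1), median(trimmed))  # with the outliers removed
```

    With the outliers in, the mean sits far above what most of the list actually serves; drop them and both averages fall sharply, with the mean falling the most.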

    Now sometimes “outliers” aren’t outliers but rather extreme cases. If you’re talking about average world prison sentence length, you shouldn’t ignore America, because there are more than two million prisoners in America. But who cares if prisoners in Qatar serve 74 months? There are only 520 people in prison there.

    Anyway, the difference between 19 months versus 32.6 months matters, but it’s still not what gets my goat.

    Oh, I’m just getting started.

    3) The table only includes 35 countries. Looking at each of these countries as equal for the purpose of statistical analysis is crazy. You’ve always got to apply qualitative common sense to quantitative analysis.

    Suriname? 665 prisoners in the whole friggin’ country!

    Montserrat? Montserr-who?! Where the hell is Montserrat?! What I’m trying to say is, who gives a flying f*ck about Montserrat? What happens in Montserrat sure as hell must stay there, because I didn’t even remember that the capital of this Caribbean island was buried in 39 feet of volcanic mud in 1995 and abandoned. The total population of this non-nation is less than 5,000!

    Give me a f*cking break. For statistical purposes, these countries don’t exist. The US has two-point-three-friggin’-million people behind bars! Equating Montserrat with the United States is bullsh*t… and the authors of this report should know it.

    You ain’t seen nothing yet!

    4) “European average,” they say.

    Now call me crazy, or chauvinistic, or “Old-Europe,” but when I say “Europe” in terms of criminal justice policy, I mean–and I think most people understand me to mean–the rich civilized part of Europe that’s now part of the European Union. (By my calculations, Greece only joined Europe about 5 years ago.)

    It’s not just geography. It’s culture. This report counts Moldova as European. Technically, yes, Moldova is part of Europe. But technically Israel is part of Asia. And Egypt and Morocco are part of Africa. But I don’t see too many Arabs in my neighborhood calling themselves African-American.

    To say “European average” and give equal weight (i.e., not adjust for population) to Moldova and Germany is crazy. Oh, but wait: Germany and France aren’t even included in the data! How can you have a “European average” without Germany and France? No offense to Botswana and Mauritius (they’re on the list), but it’s not a world average if you don’t have Russia, China, Indonesia, or India!

    If you want to be honest, say that 10 years ago Moldovan prisoners served more time than U.S. prisoners. But who gives a flying f*ck about Moldova?! (Poor Moldova. I’m sure they’re very nice. In fact, it says right on their tourism website that Moldova is “rich in fertile soil and in hardworking and caring people.”)

    And no matter which countries I count as European, I can’t duplicate the report’s average of “30.89” months. Seems to me the mean for the European countries included would actually be 34 months. But I’ll assume that was just bad work rather than intentional dishonesty, since the correction would be in their favor.

    So let’s get back to the original question: do European prisoners serve more time than the U.S. average of 28 months? Here are some of the European countries listed:

    Denmark: 3

    Netherlands: 4

    Iceland: 5

    Ukraine (yeah, what the hell, I’ll count the Ukraine as European): 5

    Finland: 8

    England and Wales: 14

    Portugal: 26

    Spain: 29

    I’d bet good money that Germany and France (which aren’t included in the data) fall somewhere between the Netherlands and England, with France being higher than Germany. That tends to be the way it is with those countries and criminal justice issues.

    So why all this typing over something as minor as sentence length? Because I don’t like being played for a fool. Because I posted a lie thinking it was true. I posted it because the numbers really surprised me. I posted it because it went against what I believed.

    I don’t like it when ideological groups spread lies. When people believe lies, and people tend to believe what they hear and read, the liars win. And liars, at least the ones that aren’t pathological, tend to have an agenda.

    Mind you, this is just the one paragraph I actually fact-checked. But coming from the intellectually empty and morally counterproductive Heritage Foundation, it shouldn’t have come as any surprise.

  • Baltimore Crime Stats

    Peter Hermann writes about playing with the numbers and the problem of accurate reporting.

    “I would suspect this goes on in most police departments,” Busnuk told me. “Others don’t have the crime problem that we do and don’t have the political pressure. But this kind of reporting is built into the DNA of the police system.”

    Kind of like how Detroit accidentally forgot to tell the FBI about 117 murders last year. Oh… those 117 murders!

    Those 117 Detroit killings are significant in that they push Balto out of the not-so-coveted big-city homicide winner’s circle. Once again, Baltimore is number two and, in the words of some police, “shooting for number one.”

  • Cooking the books?

    Anonymous posted a comment on the previous post: “I can’t wait for the fudged numbers of the NYPD Comp-stat to be exposed…”

    Boy, there sure is a lot of chatter about the fudged numbers in the NYPD (and I’m talking about chatter from NYPD officers). I didn’t hear this nearly so much even just a few years ago. It seems that downgrading crime is becoming part of NYPD culture. And that’s a shame because it takes away from the hard work of the NYPD in actually decreasing crime.

    But I don’t believe the homicide numbers are fudged. According to the latest official crime stats (week of 4/13/09 to 4/19/09), there have been 109 murders this year compared to 142 at this time last year. That’s a 23 percent drop. That’s a real drop. That’s not playing fast and loose with the numbers. That’s saving lives.

    And if the other numbers go down in sync, the drop is probably real even if the numbers aren’t. Sure, maybe felony assaults and grand larcenies are lower than reality would indicate. But if you think about it, as long as the errors are consistent month to month and year to year, those errors don’t have much of an effect. The shame is that any effort put into lowering stats is kind of wasted because you have to keep cooking just to keep even. Once you start cooking the books, you can’t stop. At least not without what will look like a big one-time increase in crime.

    To police officers I offer this bit of unsolicited advice: call it like you see it. Nobody can make you downgrade crime. Except when they do. Then write the facts as you believe them in the narrative and keep a separate list of notes documenting when, where, and who ordered you to do what.

    If the books are being cooked, one day it will boil over in scandal (and until then it chips away at a culture of honesty and integrity). And when the shit does hit the fan, the brass will cover theirs while throwing a few others under the bus.

    They’ll be covering theirs; you need to cover yours.

  • Cooking the books

    In 2007, the Kansas City Police Department reported a 22% drop in crime to the state (and to the FBI’s Uniform Crime Report). Now it turns out that crime actually went up 10%. The department basically blames this mix-up on a paperwork mess. Sounds fishy. But having worked in a real big city police department, I kind of believe them. It’s a mess in there.

    It is hard to overstate how completely overwhelming police paperwork can be. No matter what you do (except maybe if you deal with doctors, patients, and medical insurance), you have less paperwork than police.

    If police had less paperwork, they could do more policing. Think about that the next time you call for more documentation of police work. Much police work will be undocumented. All police work can’t be documented. That’s the truth we’ve got to live with.

    Last year a different city reported low crime numbers to the FBI. After the crime stats got published and this city was reported as safe, the city “discovered” more stats and submitted them to be published in some unpublicized addendum. I don’t remember the city, but I don’t think this was an accident.

    Here’s the worrisome thing. With my ear to the ground, I smell smoke (how’s that for a mixed metaphor?). There’s more and more pressure in the NYPD to produce lower crime stats. That’s not necessarily a bad thing… as long as lower crime stats reflect lower crime. But I’m starting to think that stats and reality are less and less related. I didn’t hear this a few years ago. Now I do. It’s worrisome.

    The Compstat pressure to produce lower stats is overwhelming police ability to lower crime. A mid-level commanding officer gets a new asshole chewed out at a Compstat meeting. His numbers are too high. He tells his lower-level officers he needs lower stats. He doesn’t tell them to fudge data, of course, but funny things start happening. Reports start getting “reevaluated.” Or a foreign tourist is robbed (many tourists lie about being robbed, by the way, but that’s another story) and is leaving the country the next day. If the victim is gone, the suspect won’t be prosecuted. No victim, no crime. So the robbery is recorded as “lost property.” What’s the harm?

    As a Baltimore police officer, I never felt any pressure to downgrade crime. Nor did I ever downgrade crime for the purposes of lower crime stats. But based on my professional judgment and discretion, I downgraded crime all the time.

    There’s a fine line between common and aggravated assault. “Intent to cause serious bodily harm.” Who can say for sure?

    There’s a fine line between misdemeanor theft and felony theft. How much is your laptop worth? Are we talking current value or replacement price? That’s the difference between a bit of paperwork and a major Part-One crime.

    There’s a fine line between burglary and senility. What do you do if a “victim” “thinks” things are missing from his apartment?

    There’s even a fine line between “rape” and “failure to pay.” A prostitute says she was raped. When an officer reports a rape, a lot of gears and department resources start moving. Plus the victim needs to go to the hospital and get tested. Maybe the “victim” just wants her money. Or, as I once dealt with, her three winter jackets back that the John took (it was a cold night). I could have reported a rape. That’s what she first said. But then all these gears would have turned in the wrong direction and nobody would win (and she would be cold). Instead, I investigated, got the real story, and got her jackets back. I could have locked everybody up. Instead, everybody left happy (sort of). I thought it was good policing. Best of all, there was no paperwork.

    The point is there’s a lot of discretion in investigating and lots of gray in producing crime stats. Always has been, always will. This isn’t the problem. We need to acknowledge the gray, train and pay our officers well, hold them to high standards, and move on.

    The problem is when a police department systematically–or non-systematically but on a large scale–begins to change crime data to lower the stats. This is a hole out of which you cannot dig. Every year you have to keep fudging the stats just to match the fudged stats from the previous year.

    There are only three ways out of this:

    1) You fess up after the data is released, published, and reported in the papers. Then you just hope the follow-up story gets less publicity.

    2) You get caught. You get fired. And your bosses (who directly or indirectly got you into this mess) get to gloat about how vigilant and angelic they are.

    Or 3) you get promoted and your replacements are stuck with a huge “crime” increase to manage during their first year. But they can’t say too much. You are, after all, their boss.

    There is no great solution except to keep honest stats (“at least,” to quote H.L. Mencken, “within the bounds of reason”). It’s important to make clear from the top to the bottom that any stat fudging is not to be done.

    But if you de-emphasize Compstat, you’re losing one of the tools that helped bring crime down.

    If you bring in independent oversight, you bring in more layers of management and paperwork. Not good. (But having district commanders in charge of stat collection that can help or hurt their career is begging for shenanigans.)

    Maybe you need a special number for police to call anonymously just to report problems with stats. Remember, cops don’t like fudging stats; they do so because they feel they have to. Perhaps if two anonymous officers complain, a little internal audit begins. That could scare some people straight and give honest cops the reason they need to remain honest.

    For stats, you could focus solely on homicides. Homicide stats are much harder to fudge (but there’s still some room). But that only works in an area where there are a lot of homicides. What if the main problems in an area are quality-of-life issues? How do you measure these?

    There is no easy answer. There never is. But the first step to a solution is pointing out that there may be a problem.