Pacific Standard. Smart Journalism. Real Solutions.



Texans Didn’t Make Houston Great

Posted: 26 Jul 2013 02:33 PM PDT


In the war for talent, less social capital is more. Rust Belt cities are wound too tightly. Case in point, Louisville, Kentucky:

“[In Houston], connections are really important to get my resume through the door. But I felt like in Louisville, if I didn't know somebody in Louisville it was going to be harder to get a job. And I didn't think they valued my education as much as people did here.”

Why is that?

“Houston has more companies. So in terms of job searching it made it easier for me to figure out what I wanted by having more choices, which obviously a smaller city wouldn't have that. But they did seem more open to recent graduates than Louisville did. I felt like the big companies in Louisville weren't as interested in recent graduates. They were more interested in experience.”

Louisville is more parochial than Houston. In network terms, Louisville’s strong ties point inward, concentrated within the community. Houston’s ties are weaker but more outward facing, “more open to recent graduates.”

A city with more outsiders is more likely to attract more outsiders. As more outsiders arrive, they push out locals. Those damn carpetbaggers ruined Houston. Yankee go home!

Texans didn’t make Houston great. Birthplace diversity, not density, fuels prosperity. The model of choice isn’t compact New York City, but sprawling Los Angeles:

As shown in table 2, only 27.5 percent of Los Angeles adults were born locally, that is, in California. This contrasts to local origins for 57.6 percent of New Yorkers and 60.5 percent of Chicago residents. Among Washington, DC, residents, 34.5 percent were born locally. Not surprisingly, the two high-growth regions have many more migrants from outside the area.

Houston is becoming more like Los Angeles, which appeals to someone tired of her hometown. She wants to bowl with strangers. She can’t do that in Louisville.

Why You Should Care More About the Drugs Your Doctor Prescribes

Posted: 26 Jul 2013 12:00 PM PDT


Your doctor hands you a prescription for a blood pressure drug. But is it the right one for you?

You’re searching for a new primary care physician or a specialist. Is there a way you can know whether the doctor is more partial to expensive, brand-name drugs than his peers?

Or say you’ve got to find a nursing home for a loved one. Wouldn’t you want to know if the staff doctor regularly prescribes drugs known to be risky for seniors or overuses psychiatric drugs to sedate residents?

For most of us, evaluating a doctor’s prescribing habits is just about impossible. Even doctors themselves have little way of knowing whether their drug choices fall in line with those of their peers.

Once they graduate from medical school, physicians often have a tough time keeping up with the latest clinical trials and sorting through the hype on new drugs. Seldom are they monitored to see whether they are prescribing appropriately—and there isn’t even universal agreement on what good prescribing is.


This dearth of knowledge and insight matters for both patients and doctors. Drugs are complicated. Most come with side effects and risk-benefit calculations. What may work for one person may be absolutely inappropriate, or even harmful, for someone else.

Antipsychotics, for example, are invaluable to treat severe psychiatric conditions. But they are too often used to sedate older patients suffering from dementia—despite a “black-box” warning accompanying the drugs that they increase the risk of death in such patients.

The American Geriatrics Society has labeled dozens of other drugs risky for elderly patients, too, because they heighten the risk of dizziness, fainting, and falls, among other things. In most cases, safer alternatives exist. Yet the more dangerous drugs continue to be prescribed to millions of older patients.

And, as has been well-documented by the Los Angeles Times and others, powerful painkillers are often misused and overprescribed—with sometimes deadly consequences.

As reporters who have long investigated health care and exposed frightening variations in quality, we wondered why so much secrecy shrouds the prescribing habits of doctors.

The information certainly isn’t secret to drug companies. They spend millions of dollars buying prescription records from companies that purchase them from pharmacies. The drugmakers then use the data to target their pitches and measure success.

But when we tried to purchase the records from the companies that supply them to drug manufacturers, we were told we couldn’t have them—at any price.

We next turned to Medicare, a public program that provides drug coverage to 32 million seniors and the disabled and accounts for one out of every four prescriptions written annually.

We filed a Freedom of Information Act request for prescribing data. After months of negotiation with officials, we were given a list of the drugs prescribed by every health professional to enrollees in Medicare’s prescription drug program, known as Part D.

What we found was disturbing. Although we didn’t have access to patient names or medical records, it was clear that hundreds of physicians across the country were prescribing large numbers of dangerous, inappropriate, or unnecessary drugs. And Medicare had done little, if anything, about it.

One Miami psychiatrist, for example, wrote 8,900 prescriptions in 2010 for powerful antipsychotics to patients older than 65, including many with dementia. The doctor said in an interview that he’d never been contacted by Medicare.

A rural Oklahoma doctor regularly prescribed the Alzheimer’s drug Namenda for patients under 65 who did not have the disease. He told us it was because the drug helped calm the symptoms of autism and other developmental disabilities, but there is scant scientific support for this practice.

Among the top prescribers of the most-abused painkillers, we found many who had been charged with crimes, convicted, disciplined by their state medical boards, or terminated from state Medicaid programs for the poor. But nearly all remained eligible to prescribe to Medicare patients.

If you or a loved one were a patient of one of these doctors, wouldn’t you want to know this?

We have now taken the data and put it into an online database that allows anyone to look up a doctor’s prescribing patterns and see how they compare with those of other doctors.

This information is just a start. It can’t tell you if your doctor is doing something wrong, but it can give information that allows you to ask important questions.

For instance, why is your doctor choosing a drug that his peers seldom do? Does your doctor favor expensive brand-name drugs when cheaper generics are available? Has your doctor been paid to give promotional talks for drug makers?
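
To make the comparison concrete, here is a minimal sketch of the kind of peer comparison such a database enables. It assumes a hypothetical CSV export with columns doctor_id, is_brand, and claim_count; those column names, and the 25-point outlier cutoff, are illustrative assumptions, not ProPublica’s actual schema.

```python
import csv
from collections import defaultdict

def brand_name_rates(path):
    """Share of each doctor's claims that are for brand-name drugs.
    Assumes columns: doctor_id, is_brand ("1"/"0"), claim_count."""
    brand = defaultdict(int)
    total = defaultdict(int)
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            n = int(row["claim_count"])
            total[row["doctor_id"]] += n
            if row["is_brand"] == "1":
                brand[row["doctor_id"]] += n
    return {doc: brand[doc] / total[doc] for doc in total}

def flag_outliers(rates, margin=0.25):
    """Flag doctors whose brand-name share exceeds the peer average
    by more than `margin` -- an arbitrary cutoff for illustration."""
    peer_avg = sum(rates.values()) / len(rates)
    return {doc: r for doc, r in rates.items() if r - peer_avg > margin}
```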

And we’d like to see the day when all prescribing by all health professionals—not just in Medicare—is a matter of public record.

It’s not only patients who benefit when medicine is more transparent. Doctors too can gain by comparing themselves to their peers and to those they admire. Clinics can see how their staffs stack up. And researchers can track patterns and examine why doctors prescribe the way they do.

One doctor told us that after studying our online database, he cornered his colleagues and peppered them with questions about their prescribing. Most, he said, were surprised when he told them their drug tallies.

Many aspects of doctors’ practices remain private: the number of tests they order and procedures they perform, the number of times they make mistakes. These data could help inform the public, too.

In the meantime, arming yourself with prescribing information allows you to be more active in your health care or that of an aging or disabled loved one.


This post originally appeared on ProPublica, a Pacific Standard partner site.

How to Ease Inequality on the Cheap

Posted: 26 Jul 2013 10:40 AM PDT


The president wants Congress to expand daycare access to every child in America, an expensive and politically complicated proposition, according to critics. But much less intensive pre-K support for moms could still produce major windfalls down the road for their babies, claims a new working paper from the World Bank. The authors found that once-a-week, one-hour visits from a childcare aide for mothers and kids in poor Jamaican communities resulted in 42 percent higher earnings and improved mental health when those babies became adults—20 years after only a two-year intervention.


In 1986, a group of researchers enlisted 129 babies from Kingston, Jamaica, aged nine to 24 months, all with poverty-stunted physical growth. The babies were split into four randomized groups: one group received food supplements (things like baby formula and cornmeal) every week; another received nothing; a third group of kids and their mothers received one-hour weekly sessions with a child development aide who had received a few months of training in primary health care, teaching methods, and toy making; a fourth group received the aide's visits as well as the extra food. The study team monitored things like the babies' years of schooling, IQ, and mental health (and, eventually, their earnings) throughout the first two years, then every few years until age 18. In the late 2000s, the World Bank paper's authors—including James Heckman, a Nobel laureate from the University of Chicago who has figured prominently in debates over daycare—picked up the thread, re-interviewing the study subjects around age 22.

You might think the starving kids who got extra food saw the best outcomes later in life. But previous surveys of the kids' lives established that the food aid had no long-term effect on health and earnings. The impact of the one-hour weekly visits from an aide, on the other hand, appeared far greater: Average earnings for 22-year-olds whose moms participated in those weekly sessions were 42 percent higher than the control group's. And kids who received an aide's intervention didn't just leave behind their stunted-growth peers—their earnings actually caught up to a group of children from the same neighborhoods who grew up less impoverished and received no such intervention. The aides' efforts appear to have nipped starting-line inequality in the bud.

So what was the aides' secret? They relayed a little knowledge about childcare to the moms, and encouraged them to talk to and play more with their kids. They also tried to boost the self-esteem of both child and mother through praise. Pretty basic stuff, but the authors were willing to infer causation, not just correlation, between these weekly interactions and the kids' improved outcomes: Part of the initial study looked at the parents' level of engagement with their babies outside the time spent with aides, finding that the little weekly boost inspired more engagement for two years, after which the enthusiasm dissipated. The kids nevertheless continually outpaced the control group and even caught up with their non-stunted peers.
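
That causal claim rests on the randomized design: if group assignment didn't matter, a gap as large as the observed one should crop up routinely when the groups are reshuffled at random. Here is a minimal sketch of that check, using made-up placeholder earnings rather than the study's actual data (which are not public in this form):

```python
import random

# Illustrative placeholder earnings (NOT the study's data), one list per group.
stimulation = [5200, 6100, 4800, 7300, 5900, 6600]
control = [3900, 4400, 3600, 5100, 4200, 4500]

def mean(xs):
    return sum(xs) / len(xs)

# Percent gap between group means; the study reported a 42 percent gap,
# and these toy numbers give a similar figure.
gap = (mean(stimulation) - mean(control)) / mean(control)
print(f"earnings gap: {gap:.0%}")

def permutation_p_value(a, b, trials=10_000):
    """Fraction of random regroupings whose gap is at least as large as
    the observed one -- a small value suggests the gap isn't chance."""
    observed = mean(a) - mean(b)
    pooled = a + b
    hits = 0
    for _ in range(trials):
        random.shuffle(pooled)
        if mean(pooled[:len(a)]) - mean(pooled[len(a):]) >= observed:
            hits += 1
    return hits / trials

print(f"permutation p-value: {permutation_p_value(stimulation, control):.3f}")
```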

It's worth noting that, like the few American studies that have undertaken long-term, randomized efforts to track outcomes for kids in programs like Jamaica's, the sample size was small. Another caveat: Every child in the study was given free health care. Absent that expensive component, a broken leg and a stack of huge medical bills might negate the benefits of teaching a mom how to mother.

But the findings are consistent with a well-established body of evidence that suggests early childhood interventions like daycare might be a better investment than the stock market. And improved educational attainment, mental health, and earnings for young adults 20 years after a once-a-week check-in seems like a lot of bang for your buck.

How the Zimmerman/Martin Case Hurt Race Relations in the United States

Posted: 26 Jul 2013 10:00 AM PDT


A single event can take on great symbolic importance and change people's perceptions of reality, especially when the media devote nearly constant attention to that event. The big media story of the killing of Trayvon Martin and the trial of George Zimmerman probably does not change the objective economic, social, and political circumstances of blacks and whites in the U.S. But it changed people's perceptions of race relations.

A recent NBC/WSJ poll shows that, between November of 2011 and July 2013, both whites and blacks became more pessimistic about race relations.


Since 1994, Americans had become increasingly sanguine about race relations. The Obama victory in 2008 gave an added boost to that trend. In the month of Obama's first inauguration, nearly two-thirds of blacks and four-fifths of whites saw race relations as good or very good (here's the original data). But now, at least for the moment, the percentages in the most recent poll are very close to what they were nearly 20 years ago.

The change was predictable given the obsessive media coverage of the case and the dominant reactions to it. On one side, the story was that white people were shooting innocent black people and getting away with it. The opposing story was that even harmless looking blacks might unleash potentially fatal assaults on whites who are merely trying to protect their communities. In both versions, members of one race are out to kill members of the other—not a happy picture of relations between the races.

My guess is that the Zimmerman/Martin effect will have a short life, perhaps shorter for whites than for blacks. In a few months, some will ascend from the depths of pessimism. Consider that after the verdict in Florida there were no major riots, no burning of neighborhoods to leave permanent scars—just rallies that were, for the most part, peaceful outcries of anger and anguish. I doubt, however, that we will see the optimism of 2009 for a long while, especially if employment remains at its current dismal levels.


This post originally appeared on Sociological Images, a Pacific Standard partner site.

Same-Sex Marriage and the Liberalizing Effect of the Jewish Faith on the Democratic Party

Posted: 26 Jul 2013 08:00 AM PDT


Over the past few years, we have witnessed a dizzying shift in the politics of same-sex marriage. The general public has become more accepting of the practice (66 percent of millennials now support it) and political elites, particularly in the Democratic Party, have shifted toward more equal treatment of marriage rights. In May of 2012, just days after Vice President Joe Biden revealed his own change of heart on Meet the Press, President Obama confirmed that his stance had evolved from opposition to support for same-sex marriage. Hillary Clinton, widely regarded as a prospective candidate for a 2016 presidential nomination, also recently publicly switched her position, perhaps in an effort to match those of her potential rivals for the nomination, including New York Governor Andrew Cuomo and Maryland Governor Martin O'Malley.

These examples mirror a virtual deluge of politicians changing their position on same-sex marriage; 55 members of the 113th U.S. Senate now support marriage equality, up from just 16 at the end of 2010. While many observers attribute this shifting landscape to a change of political strategy rather than a sincere change of heart, there is a religious pattern to the trend: The earliest adopters of what would become the consensus marriage-equality position of the Democratic Party were disproportionately its Jewish members. Though discussions about the influence of religion on American politics typically focus on the relationship between evangelical Christians, the Christian Right, and the Republican Party, the Jewish faith has had an unmistakably liberalizing effect on the Democratic Party, including on LGBT rights.


Using a combination of Washington Post reporter Dylan Matthews' tally of when senators adopted their positions and our own review of the historical record, we analyzed the relationship between senators' religious affiliations and when they changed their position on same-sex marriage.

The first senator to ever publicly endorse marriage rights for gays and lesbians was Ron Wyden (Oregon), a Jewish Democrat who articulated his support as a candidate for the Senate in 1995, a remarkable 17 years before President Obama would do the same. Wyden never switched his position, at least not on the public record; he ran for office endorsing marriage equality. However, after his election, two other senators followed his lead and endorsed same-sex marriage in 1996: Ted Kennedy (Catholic) and Carol Moseley-Braun (also Catholic). After that, it was a very slow march toward support for marriage equality in the U.S. Senate.

When Senator Dianne Feinstein (D-California) changed her position on marriage equality in 2008, the number of supporters in the Senate finally reached 10. Of those 10 senators, five were Jewish. By the end of 2010, 16 senators supported marriage equality, and of those 16, seven were Jewish. At the end of 2011, 30 senators were in favor, and of those 30, all 11 of the body's Jewish members were on board.


There is currently a clear religious division regarding support for same-sex marriage in the Senate. Not surprisingly, Jewish and unaffiliated senators are in favor and evangelical Christians are opposed. Of the 16 evangelical senators currently serving, only one, Jon Tester (D-Montana), endorses marriage equality. The other major religions have some diversity of opinion. Fifty-five senators now openly support marriage equality (50 Democrats, three Republicans, and two Independents); 18 of those are Jewish, unaffiliated, or Buddhist (Mazie Hirono, D-Hawaii). The remainder of the Senate is made up of 33 mainline Protestants, 18 of whom support marriage equality; 26 Catholics, 16 of whom support equality; and seven Mormons, two of whom support marriage equality.
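
A quick tally makes the religious gradient explicit; the counts below come straight from the figures cited in this post, not from any new data:

```python
# Support for marriage equality in the 113th Senate, by religious
# affiliation: (supporters, total members), per the counts cited above.
senate = {
    "Jewish, unaffiliated, or Buddhist": (18, 18),
    "Mainline Protestant": (18, 33),
    "Catholic": (16, 26),
    "Mormon": (2, 7),
    "Evangelical": (1, 16),
}

for group, (yes, total) in senate.items():
    print(f"{group:34} {yes:2}/{total:2} ({yes / total:.0%})")
```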

Partisan divisions over the issue of same-sex marriage, while stark, may obscure the religious foundations of elected officials' positions. Jewish politicians gravitate toward the Democratic Party in part because of their cultural liberalism, while evangelical Protestants have abandoned the party in part due to their conservative positions on cultural issues.


As we have shown elsewhere, while religion influences the selection of a party in the first place, it also has an impact on which positions gain consensus within each party's caucus over time. Jewish senators were the first to embrace the movement for marriage equality, eventually bringing many of their Gentile colleagues along with them, which seems to have solidified the party's position. On the other side of the aisle, we expect that evangelical Christians will be the last holdouts in opposition to marriage equality, thereby anchoring the Republican side of the issue.

Looking ahead, further support for same-sex marriage in the Senate will likely require either the persuasion of additional members from moderate religious groups, such as mainline Protestants and Catholics, or the defeat and replacement of evangelical Christian senators by candidates from religious groups friendlier to LGBT rights.

Politicians undoubtedly consider many factors when deciding which side of an issue to support. Though social movements, campaign donors, and the majority opinion of their constituents surely enter into their calculations, they also look within to their own values, which are informed in part by their religious beliefs. For support for same-sex marriage to gain such momentum in the Senate in a matter of just a few years, there had to be a critical mass of senators who were willing to go there first. And it was the Senate's Jewish members who led the way.

Do Television Shows Like ‘CSI’ Deter Cybercrime?

Posted: 26 Jul 2013 06:00 AM PDT


Legal experts and behavioral scientists have gone back and forth over the years about the so-called “CSI effect”: whether jurors have been led by fictional TV programs to hold unreasonable expectations of the forensic evidence used in actual crime investigations.

The theory goes like this: In a crime drama like CSI, handsome nerds in lab coats produce unequivocal results in the time between one commercial break and the next. Real forensic scientists work slowly and often present their findings as theories rather than facts. Even the most technologically advanced forensic labs can suffer from contamination or human error, and fingerprint identification is more of an art than a science.


Just last week, The Washington Post reported on "an unprecedented federal review of old criminal cases," including 27 death penalty convictions, in which FBI forensic experts may have "exaggerated" their testimony about the scientific evidence they had analyzed for those cases. The review has revealed the widespread and long-standing limitations in one type of forensic analysis—hair comparison—a revelation that has huge repercussions for thousands of federal and state cases.

Clearly, forensic science has its gray areas. Yet it's hard for some jurors to get past the glamorized version of the process, and so they expect black-and-white certainty: hence, the oft-cited "CSI effect." But other legal researchers have dismissed this notion as mere anecdotal pop science. So the jury's still out on that one, so to speak.

But aside from whatever impact these types of shows may have on juries and the general public, what about their effect on real-life crime? Criminals watch TV, too, right?

A new study being released in the International Journal of Electronic Security and Digital Forensics suggests that TV crime dramas might have a measurable effect on cybercrimes in particular. The study's author, informatician Richard Overill of King's College London, came to this conclusion after analyzing patterns in cybercrime data in the U.S. over the past 11 years. (The first CSI series premiered in 2000, and Miami and New York spin-offs followed.)

According to Overill, there are several ways in which potential criminals might adjust their modus operandi to evade the kinds of "unambiguous" and "instantaneous" forensic sleuthing they see on television. He writes:

They are likely to withdraw from cyber-criminal activity that now appears too risky in the light of the perceived ease of discovery. They may migrate to alternative modalities involving many layers of concealment, stealth and obfuscation. The up-front investment required to implement these advanced methodologies will necessitate a proportionate increase in the expected returns, in order to maintain a stable cost-benefit ratio. Thus we would anticipate a compensating increase in the average value of cyber-crime heists, accompanied by a migration to sophisticated strategies of concealment.

Translation: Many would-be criminals are probably scared away from these types of crimes by what they imagine to be easy detection by law enforcement. So there may be less cybercrime overall than there would have been had crime dramas been less infatuated with cyber exploits as their go-to plot points. But it also means that the criminals who are willing to go ahead with cybercrime will be smarter, better funded, and generally better equipped to evade detection.
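
Overill's argument is, at bottom, an expected-value calculation: a higher perceived chance of getting caught lowers a heist's expected payoff, so the heist must be bigger to stay worthwhile. A toy sketch with made-up numbers (none of these figures appear in the paper):

```python
def break_even_heist(upfront_cost, p_caught, penalty):
    """Smallest heist value with a non-negative expected return, i.e.
    (1 - p_caught) * value - p_caught * penalty - upfront_cost >= 0."""
    return (upfront_cost + p_caught * penalty) / (1 - p_caught)

# As the perceived probability of detection rises, the break-even heist
# value climbs steeply -- Overill's "compensating increase" in heist size.
for p in (0.05, 0.20, 0.50):
    v = break_even_heist(upfront_cost=10_000, p_caught=p, penalty=50_000)
    print(f"perceived catch probability {p:.0%} -> break-even heist ${v:,.0f}")
```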

For example: the group of Russian and Ukrainian hackers who, federal prosecutors announced on Thursday, perpetrated the largest hacking and data-breach scheme in the country, attacking NASDAQ and several online retailers and ultimately doing hundreds of millions of dollars' worth of damage? Those guys are definitely pros. On the other hand, they did get caught, although it took eight years for that to happen.

As I mentioned in my last post, about a new crop of digital filters being developed to combat child pornography, it's all an arms race. Cybercrimes will probably continue to occur, and law enforcement agencies will continue to crack them, at steadily increasing levels of sophistication. Whether TV shows dissuade some dummies from trying their hand at that dangerous game probably won't make much of a difference.
