_________________________________________________________________

Sunday, December 6, 2015

Why I voted against ThoughtExchange

At our last meeting, the board voted 4-3 to enter into a three-year contract with ThoughtExchange, an online platform to solicit ideas and opinions from district residents. The contract cost $106,462. I voted against it, for several reasons.

The cost alone was not my primary objection. I can imagine issues on which a well-done survey of community opinion would be worth paying for. My concerns about ThoughtExchange were:

1. Although ThoughtExchange was presented as a way to “take the temperature of the community”—the vendor even referred to it as “polling”—it was not at all vetted for that purpose and is very clearly not up to the task.

There is a difference between tech expertise and statistical expertise; the vendor provided no information about the statistical capability of ThoughtExchange to measure the opinions of the community as a whole. He probably couldn’t, even if he tried. Participation is not random; some users might make one quick visit while others might visit repeatedly and participate for long stretches; it’s fairly easy for people to have multiple accounts; and the total number of participants on any given issue is likely to be a small fraction of the total community population. As a result, the margin of error, if it could even be calculated, is likely to be so enormous that the results would tell us very little about community sentiment. (A short sketch after this list illustrates the problem.)

That problem is compounded by the fact that a significant chunk of our community (estimated at about six percent of households) does not have regular internet access, and that group is probably not random; it likely skews toward low-income residents.

2. I’m concerned that the motivation to use ThoughtExchange is more about putting on a show of community engagement than actually engaging in a meaningful way. (I had these same concerns about ThoughtExchange’s predecessor, MindMixer.) There’s no point in asking the public for input if we’re not willing to adjust our decisions accordingly once we get it, but it often seems like the district wants to do the former and not the latter. In those instances, people just feel worse than if their input had never been solicited at all.

The district’s likely strategy is not to ask any questions that it doesn’t want to hear the answers to, and to word the questions in ways designed to steer participants toward particular responses. Three years ago, I made fun of the district for using MindMixer to ask, “What are the school district’s biggest strengths?” Then, when the ThoughtExchange vendor made his presentation, one of his examples of a question that could be asked was, “What are some things you appreciate about your school this year?”
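To make the sampling point in (1) concrete, here is a minimal simulation sketch. The population size, the share of highly engaged residents, and the participation rates are all invented for illustration; they have nothing to do with any actual ThoughtExchange numbers. The sketch shows how a self-selected online sample can produce a confident-looking result (a small nominal margin of error) that is nonetheless far from true community sentiment, because the margin-of-error formula assumes respondents were chosen at random.

```python
import random

# Purely hypothetical sketch (invented numbers, no ThoughtExchange data):
# when participation is self-selected, the usual margin-of-error formula
# can report a tight interval around a badly biased estimate.

random.seed(0)

COMMUNITY_SIZE = 60_000   # hypothetical number of district residents
# Assume true community support for some proposal is about 50%:
# a highly engaged 10% of residents support it at 90%, everyone else at ~45.6%.
ENGAGED_SHARE, ENGAGED_SUPPORT, OTHER_SUPPORT = 0.10, 0.90, 0.456

participants = []
for _ in range(COMMUNITY_SIZE):
    engaged = random.random() < ENGAGED_SHARE
    supports = random.random() < (ENGAGED_SUPPORT if engaged else OTHER_SUPPORT)
    # Self-selection: engaged residents are ten times as likely to take part.
    if random.random() < (0.05 if engaged else 0.005):
        participants.append(supports)

n = len(participants)
p_hat = sum(participants) / n
# Standard margin of error for a simple random sample -- which this is not.
moe = 1.96 * (p_hat * (1 - p_hat) / n) ** 0.5

print(f"participants: {n}")
print(f"observed support: {p_hat:.1%}  (nominal 95% margin of error: +/-{moe:.1%})")
print("actual community support: ~50%")
```

With these made-up numbers, a few hundred self-selected participants report roughly 69 percent support with a nominal margin of error under four points, even though the community as a whole is split roughly 50/50; the formula simply has no way to see the self-selection.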

Unfortunately, my (1) and (2) correspond to the two things we’re actually paying for (that we wouldn’t get from engagement through, say, the district’s Facebook page): (1) the “data” analysis (which is of little value if the data is not representative of the community as a whole), and (2) the manipulability and control that comes from being able to decide what questions to ask and how to ask them. We should not pay for either of those things.

It’s hard not to see ThoughtExchange as primarily a public relations campaign posing as a concern for community input. The board should consider whether that might put off as many people as it attracts. As commenter Amy Charles wrote, “No, do not tax me in order to build a case for more taxes. Spend the money on the frigging schools, and do it sensibly.”

3 comments:

Amy Charles said...

Regrettable. Chris, may I suggest that when the contract comes up for renewal, assuming you're still engaged in this exercise, you enlist the help of Doug Jones, in CS, whose main work recently has been in voting-machine software. He's very good at carving up the useless and identifying pretense, and is also extremely helpful in suggesting sensible alternatives to whatever he's just taken apart. (He's also fair-minded, which means that if he sees the software does have advantages, he'll say so.)

The silliness of this purchase does leave me wondering what else we've been sold, or are about to be sold, that's more serious. I have in mind, of course, the trouble many districts, driven by the usual pressures, have gotten themselves into over the last decade or so via investments they didn't understand and bond issues they couldn't afford. Who advises the district on these things, do you know?

Amy Charles said...

Incidentally, you're no doubt aware of this, but the K12Insight survey about the charter schools (plz) has no IP tracker stopping people from filling out the form multiple times. I just did. As far as I can make out from K12Insight's promo materials, the district doesn't see the IPs, which would mean we have no idea who just sat there holding down "yes" and "no" keys, effectively. Which in turn would make the survey meaningless.

Although for all I know, that's the point. Who's running this joint, anyway?

Amy said...

Hey, question about this charter-school survey: I'm sure the contractor sends a digest to the district; since we the people are paying for this thing, where will we be able to see all the data collected? Not a digest, but boring old page after page after page of individual responses, so that we the people can do our own data analysis?