New terms or new trends?

The Twitter buzz yesterday was about two posts by Stowe Boyd and Randal (Rand) Leeb-du Toit advocating "Social Business" as a replacement term for "Enterprise 2.0", along with the launches of new consulting services.

Names have value. Sector names are powerful totems in the consulting business. As a recovering industry analyst, I remember well the desire to name trends, and to have your name associated with the trend. And when innovation happens, it helps to be able to call it something. At Socialtext, we pioneered the adaptation of internet collaboration and communication tools for business use, well before there were names for things. It was a lot harder to explain what we were doing before someone coined "Enterprise 2.0".

In this case, the proposed change from "Enterprise 2.0" to "Social Business" is a lot less interesting than the trend the advocates are describing. Boyd's and Leeb-du Toit's posts both describe a shift from an early market phase, when early adopters experiment with bringing Web 2.0 tools and techniques to the organization, to a more mature phase, in which people have a more sophisticated understanding of how to evolve their organizations to take advantage of the tools, and the tools are deeply adapted to be suitable for organizational use.

So, switch terms? Shrug. Think more deeply and act more powerfully to adapt organizations and tools? A good thing to talk about and to do.

Fresh – the paradoxical story of seemingly natural food

Why were salads in aspic fashionable in the early part of the 20th century? Because of the recent spread of refrigerators in middle-class homes. In the late 1920s, cookbooks published by the makers of Frigidaire and Kelvinator refrigerators promoted "refrigeration cookery", while the Ladies' Home Journal, the leading advertising vehicle for household technology, "ran recipes for pigs' feet aspic and egg-and-asparagus molds", and urged readers to keep their fridges well-stocked with olives, capers, pimentos, and other colorful garnishes for luncheon salads and "jellied things." Why did San Francisco become a center for graphic design, a role that's continued into the 21st century with web design? It was the place where fruit crate labels were designed, when California's fruit growers banded together to expand the market for oranges shipped long distance.

Fresh: A Perishable History, Susanne Freidberg's book on the social history of the concept of freshness, is chock-full of amusing and telling details that illustrate how the concept of freshness was defined in the modern age. The story is always more complex than it appears. The "fresh" salads that appear in grocery stores are delivered in high-tech polymer bags, variably permeable in response to temperature changes, and filled with nitrogen. The live fish that swim around the fishtanks of Hong Kong and mainland China are imported from the Philippines and Indonesia, caught, anesthetized, and shipped by air. The "fresh" milk purchased in supermarkets is the product of a system of regulation originally intended to preserve local markets but now a convoluted tangle, and of a labor system dependent on migrant workers from Mexico who fear that leaving the farm means capture and deportation. The concept of fresh food, which on its surface appears to be the most natural and uncomplicated of attributes, is the product of complex webs of technologies, transport systems, legal structures, and labor markets. The colorful, shiny surface of the consumer marketing of food is designed to hide the technological sophistication and the messy and often cruel realities of the production process.

This is a wonderful book that entertains even as it informs about aspects of life that many of us take for granted. My one quibble is that the author maintains a scholarly equanimity about the unsavory and unsustainable aspects of the food system. The chapter on fish, for example, tells story after story of a fishery stripped of its population and the industry moving on to the next shoals. Unlike, say, the environmental tragedy told in Mark Kurlansky's Cod, Freidberg's matter-of-fact narration fails to consider the obvious and well-reported inferences about where increasing consumption and declining fish stocks ultimately lead. Nor does Freidberg draw strong conclusions like long-form journalist Michael Pollan, whose The Omnivore's Dilemma offers a well-researched indictment of the health and environmental failures of the industrial food production system.

The chapter on the locavore movement, which is attempting to reconstitute markets and practices for locally produced food, doesn't quite sink to the false-equivalence depths of newspaper articles that dismiss the environmental value of local food because some people produce hothouse tomatoes in Vermont in January. The environmental intent of the locavore movement is to reduce the fossil fuel cost of food trucked and shipped and flown around the world, not to produce food as close as possible to the eater regardless of environmental cost. But Freidberg is extremely skeptical of any efforts to improve the situation. Looking at previous failed efforts, ranging from Upton Sinclair's slaughterhouse exposés to attempts to preserve local milksheds with counterproductive results, Freidberg concludes that any efforts to improve matters are as likely as not to have negative unintended consequences; so, by implication, why bother trying? And since all systems have mixed natures, why try to evaluate what is better or worse?

I’m not wishing that Freidberg had written a manifesto; that would be a different book entirely. My ideal for the genre is Ruth Schwartz Cowan’s More Work for Mother, which, by the way, Freidberg quotes in places: a brilliant book on the social history of household technology with a steel skeleton of the history of technology adoption in the 20th-century US. Without diverging into prescriptive politics out of place in a work of history, Cowan nevertheless makes an underlying, well-supported argument that household technology has served, for reasons of ideology and marketing, to maintain housework at a level beyond need and value.

Even though Freidberg refrains from drawing seemingly obvious conclusions and making potentially supportable judgments, the robust material of her history provides a good foundation for those who wish to follow the facts further toward reasonable conclusions, and who have less skepticism about the value of action in an imperfect world. And the counterintuitive tales of the unintended consequences of idealistic action, and of the complex roots of seemingly simple things, are good cautions against any simple story, whether it be the promise of a pretty supermarket package or the promise of a simple sustainable solution.

On the thoughtful use of points in social systems

Yesterday afternoon on Twitter there was a brilliant conversation among Kevin Marks, Tom Coates, Jane McGonigal, Tara Hunt, Josh Porter and a few others on the thoughtful use of points and competition in social systems.

Motivated by the popularity of games, designers of social systems sometimes adopt scoring and ranking systems simplistically, with counterproductive results.

The easiest design mistake is to throw up a “leaderboard” ranking all participants by a single dimension. Tom Coates explains that this simply discourages most participants. “Competitive charts, particularly at scale, basically are disincentives! Why compete to be #134,555th best at something!” Jane McGonigal agrees: “Cumulative allplayer scoreboards/ranks/achievements are fail, should be stopped like blink tags ^_^”

Tom Coates argues that a system’s reliance on extrinsic motivation is a sign that the activity isn’t compelling enough on its own: “Flickr doesn’t need leaderboards to motivate actions, no need for competition. Nor Facebook. Or most blogs.” “There may be some role for play, but what motivates me to show my brother things or hang out with friends is not points!” Social systems should facilitate people’s intrinsic social motivations: “In social contexts we build on reasons people already interact with friends / peers / public. Company, sharing, showing off.”

Others in the conversation are less skeptical of points systems, and recommend ways to use them more wisely.

Instead of a single, discouraging ranking, Kevin Marks recommends using leaderboards for friends, with many publics applied to scores: “In gamelike situations comparing scores with peers + friends beats global ranks.” Jane McGonigal observes that social competition can be rival-based (pick one person in your social network as your rival), as in Sharkrunners.
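
To make the idea concrete, here is a minimal sketch of a friends-scoped leaderboard. Everything in it (the function name, the data structures, the sample scores) is a hypothetical illustration, not code from any of the systems discussed.

```python
# Sketch: rank a player among friends rather than against all players.
# All names and data here are hypothetical illustrations.

def friends_leaderboard(scores, player, friends, top_n=10):
    """Return (rank, board) for `player` among their friends only.

    scores:  dict mapping player id -> score
    friends: set of player ids in the player's social circle
    """
    peers = friends | {player}
    board = sorted(((scores.get(p, 0), p) for p in peers), reverse=True)
    rank = [p for _, p in board].index(player) + 1
    return rank, board[:top_n]

scores = {"alice": 120, "bob": 340, "carol": 95, "dave": 10}
rank, board = friends_leaderboard(scores, "carol", {"alice", "dave"})
print(rank)  # 2 -- carol is #2 among her peers, not #134,555th globally
```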

Josh Porter takes time and the adoption cycle into consideration. “I see leaderboards as an early-stage igniter in social networks…good for a time of fast growth but not healthy long-term.” Another way time needs to be considered, says Porter, is the lifespan of scores: cumulative scoreboards eventually break (like Amazon having to reset Top Reviewers last year).

Jane McGonigal sees points systems as valuable, particularly for charting progress toward hard-to-achieve goals, as with Nike+, the gamelike system for tracking running progress. But explicit metrics are only one element of designing a game that people want to play: “points” and “levels” without obstacles and an authentic non-game desire to progress/improve are awful.

Another consideration, I think, is participant temperament. Some participants will be more motivated by the desire to be the best, or by social rivalry, or by individual achievement, and some will have social, noncompetitive motivations. Unless the designers explicitly want to select for a single personality type – say, a shooter game for the aggressively macho – they need to consider the different temperaments in the participant community, and think about ways to support the motivations of different types of participants. Kevin Marks cites Bartle’s classic 1996 essay, “Hearts, Clubs, Diamonds, Spades”, on the kinds of players in multi-user computer games.

This is an important public conversation to be having. The practices and principles of social design are starting to emerge from many experiences and experiments in networked social systems. It is a great time to reflect and learn from social systems in the world, and derive lessons for the thoughtful use of points systems (as well as other principles of social system design).

I’ve tried to capture the tweets from yesterday’s conversation on Delicious using the socialincentives tag; please feel free to add ones I missed.

Peninsula High Speed Rail Teach-In

My main question about the Peninsula Cities Coalition’s efforts to solicit public feedback on the California high-speed rail plan, starting with a teach-in this past weekend, was whether feedback would have any impact at all. Coordinators included two local city council members, Terry Nagel of Burlingame and Yoriko Kishimoto of Palo Alto, seeking in good faith to organize public input on a major public project. But the High Speed Rail Authority, the agency charged with building the high-speed route between San Francisco and Los Angeles, is an appointed body, with a majority of its board members named by the governor. The mission of the agency is to get the project done, not to listen to residents. There isn’t any obvious reason that it would listen to the concerns of the people who actually live on the route the train will pass through.

The event gave me some cautious optimism that there is a way for public input to have an impact. Dominic Spaethling, a representative of the High Speed Rail Authority, mentioned that they will be seeking feedback from local governments about how design choices will affect local areas. And a representative of Senator Simitian’s office also sought feedback filtered through cities. The Peninsula Cities Coalition is a group of cities (currently five: Burlingame, Belmont, Atherton, Menlo Park, and Palo Alto), and the series of events is an excellent venue for city officials to gather input from residents. Tactically, cities have local land use and permitting authority, so a good working relationship with cities is in the High Speed Rail Authority’s interest. So there may be a vehicle to funnel input through cities, and some interest on the Authority’s side in listening.

Another reason for optimism was the demeanor of Robert Doty, the Director of the Peninsula Rail Program, a combined program to develop Caltrain modernization and High Speed Rail. He has responsibility for interagency coordination and regulatory approvals, so on the ground, he’s a key person. He developed the highly successful Baby Bullet program for Caltrain. Doty seemed both practical and considerate of local concerns. For example, he acknowledged that the implementation of the BART-SFO connector was botched, in cost and design. He mentioned the five stairways, sounding like someone who’s tried to use them! And he seemed to have a good relationship with the local folks on the podium. By contrast, representatives of transit projects and agencies sometimes come across as high-handed, presenting a seemingly inevitable conclusion to their audience.

The day contained a number of panels and presentations, with various speakers representing different aspects and opinions of the project.
* Doty, as I mentioned, came across as no pushover, but as someone who was engaged with the community.
* Rich Tolmach, of the California Rail Foundation, opposed the project, and is seeking ways to stop it.
* Dave Young, of the engineering firm Hatch Mott MacDonald, presented information explaining how tunneling might be practical. This is an approach that some hope will reduce long-term impact, and others dismiss as impractical.
* Greg Greenway, of the Peninsula Freight Users Group, advocated for continuing to use the corridor for freight rail. However, the Caltrain corridor gets only about 10% of the freight that travels through the East Bay, and compared to Oakland, the Port of Redwood City is a tiny blip. It’s not clear to me that other design goals should be sacrificed to support what’s a tiny freight base on the Peninsula.
* Bill Cutler presented the Context Sensitive Solutions approach. This sounded like a fine process for gathering community input, but it was not at all clear how the Context Sensitive Solutions process would dovetail with the technical and operational schedule for the actual working of the project.

Unfortunately, I was late and missed the first session by Gary Patton. According to the California High Speed Rail Blog, Patton talked about how the project can still be stopped.

My least favorite aspect of the day was the “Open Space” section. I say this as someone who has attended and coordinated many Open Space sessions and unconferences. These can be excellent in two very different ways. When there’s a general area of interest, such as digital mapping technology, it’s wonderful to have people with interest and knowledge hold sessions on the topics they care about. It’s not infrequent for projects and other follow-ons to be spawned from great sessions, but there’s no requirement for follow-up. When there is a common goal, it is a good way to have people self-select into interested groups and develop next steps. In this case, there was a common goal, but there was no clear charter for groups to serve that goal. Someone from the Peninsula Cities group volunteered to collect the writeups, but there was no clear sense of follow-up. This part of the program could have been better designed. The session I was in was a brainstorming discussion about how to use online tools to support the public input process. I’ll blog more about that separately.

At the event, I had a chance to meet in person the author of the Transbay Blog, which covers transportation and land use issues in the Bay Area, after meeting by blog comment and Twitter. I also got to chat with Robert Cruickshank, who writes the California High Speed Rail Blog, which advocates for high-speed rail, as well as Andy Chow and Brian Stanke of the Bay Rail Alliance, a rail advocacy group.

These days, blogs and advocacy groups are at least as important as traditional media as sources of information on issues like this. So far, the event has been covered by Palo Alto Online and the California High Speed Rail Blog.

Disclosures:

I helped the coordinating team for the event, including setting up the online event registration and helping with outreach, though I wasn’t able to make the meeting where the Open Space session was planned. I’ll send my feedback to the coordinators, with suggestions about how to better use Open Space in this context.

I live in Menlo Park and work in Palo Alto. I cross the tracks at least twice daily by bike. I like walkable, bikeable, livable neighborhoods with access to transit. I’m interested in having a much better system of regional transit, and in ways for our society to wean itself from fossil fuel. I lived in Boston during the Big Dig, and watched the city pay the cost of trying to undo a bad decision, made 50 years earlier, to build an elevated highway through town.

Though I’m concerned that the structure of the High Speed Rail project puts citizen input at risk of being ignored, I believe in general that the likelihood of having an impact gets infinitely higher when you show up and try to make a difference in a practical fashion.

Social context for distributed social networks

I am glad to see the increased interest in and discussion of distributed social networks. And it’s intriguing to think about the ideas described by Om Malik, Anil Dash, Dave Winer, and others, with blogs as the home base and source for such distributed social networks. But blogs and individual interests aren’t enough to serve as sources and points of aggregation.

Social context, I think, will be a critical factor in the adoption of distributed social networks. The vision of a “universal identity” and full “data portability”, sharing everything with everyone, is much too broad, not simply for reasons of privacy, but of attention. It’s in no way a secret when I go bicycling, but only a small number of people will care about my bike routes.

So what do I mean by sharing in social context? Social context is the way that people think about what’s relevant to share with whom. If I have a photo to share from South by Southwest, I want to share it with others who went there (or are interested in it). The category of “photos” is too coarse-grained. The category of “friends” vs. “business” is too coarse-grained, and in the context of SXSW makes my head hurt. We need to be able to define social context, and then share appropriately in that context.

A key reason why fine-grained sharing has failed until now is that tools ask users to make decisions based on content type (who do I want to share videos with?) or broad categories (are you my friend or colleague?). (By the way, if you’ve successfully figured out Facebook’s system, please explain it to me; I’ve tried and failed.) I’ve written before about the need for decentralized profile data as a key piece of the distributed social network. Another key element, I think, is likely to be tagged activity streams. Within a given social context, it becomes pretty obvious which profile fields or which types of tweets you want to share.

Fortunately, people already use ad hoc tags to define events and interests, and use these socially defined tags to aggregate across tools such as Flickr and Twitter. However, this practice isn’t very explicit or well-defined, so it’s hard to make it usable or automatable. I think that the practice of using tags to define social contexts, and usable tools to share information in those contexts, will become important. When tags become valuable, they also attract spam, so a layer of authentication and explicit group definition will be needed once spam becomes an issue.
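
As a rough sketch of what explicit, tool-supported social contexts might look like, here is a hypothetical mapping from context tags to audience groups; none of the tag names, group names, or functions come from an existing tool.

```python
# Sketch: tags define social contexts, and each context maps to an audience.
# All tag and group names are hypothetical illustrations.

CONTEXTS = {
    "sxsw":   {"sxsw-attendees", "web-friends"},  # conference photos, updates
    "biking": {"cycling-club"},                   # bike routes
    "work":   {"colleagues"},
}

def audiences_for(item_tags):
    """Union of the audience groups for an item, based on its context tags."""
    groups = set()
    for tag in item_tags:
        groups |= CONTEXTS.get(tag, set())
    return groups

print(audiences_for({"sxsw", "photos"}))
# {'sxsw-attendees', 'web-friends'} -- shared in context, not with everyone
```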

Summary – if you ask people, in a general fashion, what data elements they want to share with whom, they will give up, overwhelmed. But when tools enable people to share profile information, stream updates, and content in social context, people will be able to make pretty good decisions. Supporting standards, features, and usability to enable sharing in context will help make distributed social networking real.

Tags for ActivityStrea.ms

I’ve been looking at the promising new ActivityStrea.ms proposed standard, based on the existing Atom standard. ActivityStrea.ms promises to enable sharing activity streams across applications. This is a key piece of the “distributed social network” vision, where people can distribute and aggregate their activities across multiple sites and tools.

I was thinking through a potential distributed-community use case, and saw what seems to be an item that is missing but easy to add. The current ActivityStrea.ms proposal seems logically based on an assumption that a user would want to distribute or aggregate all of their updates, or perhaps choose to share updates based on the *type* of content or action – I want to share my blog post updates, but not my photos.

What’s missing is the concept of syndicating or aggregating by *topic*. Let’s say I have a stream of messages talking about web standards, politics, and music. I want to syndicate only music-related updates to a music site. What’s needed seems simple – a way of adding a piece of metadata – a tag – that identifies a given update as a music-related update. Then, I can choose to share only those updates to the music site.

Fortunately, there are ways to represent tags in Atom – see this representation in WordPress. So it would be possible to address this need fairly simply, by adding the use of tags as a recommended best practice for ActivityStrea.ms implementors. No need for a new standard; just leverage the existing one for this new use case.
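
To make this concrete, here is a minimal sketch of filtering Atom entries by category term. Atom’s category element (RFC 4287) is the “tag” in question; the feed content and tag values below are made-up illustrations.

```python
# Sketch: filter Atom entries by <category> term -- the "tag" discussed above.
# The feed content is a hypothetical illustration.
import xml.etree.ElementTree as ET

ATOM = "{http://www.w3.org/2005/Atom}"

FEED = """<?xml version="1.0"?>
<feed xmlns="http://www.w3.org/2005/Atom">
  <entry>
    <title>New mix posted</title>
    <category term="music"/>
  </entry>
  <entry>
    <title>Thoughts on web standards</title>
    <category term="standards"/>
  </entry>
</feed>"""

def entries_with_tag(feed_xml, tag):
    """Yield entries carrying a <category> whose term matches `tag`."""
    root = ET.fromstring(feed_xml)
    for entry in root.findall(ATOM + "entry"):
        terms = {c.get("term") for c in entry.findall(ATOM + "category")}
        if tag in terms:
            yield entry

for entry in entries_with_tag(FEED, "music"):
    print(entry.find(ATOM + "title").text)  # -> New mix posted
```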

As the uses of Activity Streams proliferate, and the social contexts in which they are used become more complex, people are going to want to exercise discretion about what to share where — not just for reasons of privacy, but for reasons of attention management. Baking in a tag best practice could lay the groundwork for more socially useful Activity Stream sharing.

What do you think? Am I missing something obvious?

Agile social incentives

A poster at Codexon wrote a blog post bragging about the way he took advantage of the points system on Stack Overflow, the programming Q&A site. Reflecting on Adrian Chan’s recent post about the weaknesses of structured, game-like incentives in social media, I more-than-half-expected a situation where griefers were gaming the points system and messing up the community. I found something else instead.

Stack Overflow is a community for programmers to ask and answer questions. It has a detailed reputation system designed to reward participation and high-quality responses. For those unfamiliar with the site, the points system is well thought out with respect to the behavior it’s trying to foster, really geeky, and rewards those who feel motivated and amused by thinking quantitatively about the ways their constructive participation gains them more status and powers on the site. The design of the site and its reputation system hits all four attributes in Peter Kollock’s taxonomy of social motivations as cited by Joshua Porter: reputation, reciprocity (you can see who responds to whom), efficacy (it’s intended to reward fast and good answers), and attachment to group.

The reputation troll bragged about his techniques for racking up reputation points: answering quickly, regardless of the quality of the response; downrating comments that are ahead of yours; and using formatting to make your posts stand out.

In the comments to the post itself, a good number of commenters called him out for obnoxious behavior, despite the poster’s insistence that he was merely gaming the system for his own entertainment and to point out its weaknesses. One example: “But isn’t that missing the point? I use SO and gathered some (+2000) rep but my main goal is to provide answers to actual questions and not to abuse the achievement system.” It was mildly encouraging that the comment thread didn’t reveal a throng of trolls outing themselves for self-serving antisocial behavior. But it was only mildly encouraging. The overall tone of the conversation on the poster’s site was one of frustration that the poster was willing to go to the trouble of decreasing the quality of information for the community in order to gain an essentially pointless reward.

Even more interesting was the “meta conversation” on the Stack Overflow site itself. There, participants analyzed the troll’s behavior and identified which of the reputation-gaming tactics were actually destructive to the community. In practice, posting a quick low-quality response is not that harmful, since other people quickly respond with better-quality answers that get uprated, and the original low-quality answer floats down below the fold. On the whole, adding formatting and images to posts is a good thing, since the visual emphasis makes the content easier to understand.

The one thing that site participants saw as truly harmful was the strategic downvoting of others’ answers in order to make one’s own answers rise in value. Jeff Atwood, aka @codinghorror, the site’s lead developer, commented on this point, saying this is the one thing they are considering changing the algorithm to discourage.
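
As a purely hypothetical sketch – not Stack Overflow’s actual algorithm – here is one way a site might detect the strategic-downvoting pattern, by flagging users whose downvotes cluster on questions they themselves answered:

```python
# Sketch: flag users whose downvotes cluster on competing answers.
# A hypothetical heuristic, not Stack Overflow's real reputation code.

def competitive_downvote_ratio(downvotes, my_answers):
    """Fraction of a user's downvotes cast on questions they also answered.

    downvotes:  list of (question_id, answer_id) tuples the user downvoted
    my_answers: set of question_ids where the user posted an answer
    """
    if not downvotes:
        return 0.0
    competing = sum(1 for q, _ in downvotes if q in my_answers)
    return competing / len(downvotes)

downvotes = [(101, 7), (101, 9), (202, 4), (303, 1)]
my_answers = {101, 202}
ratio = competitive_downvote_ratio(downvotes, my_answers)
if ratio > 0.5:
    print(f"flag for review: {ratio:.0%} of downvotes target competitors")
```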

In reaction to the griefer, you can see the community assessing its own practices and identifying an area to improve. The developer with the power to make changes is participating in the conversation and resolving to protect against the problem. Watching the Stack Overflow community react to an antisocial participant suggests something that is as important in a social system as any particular rule or feature – the ability to evolve the rules.

One of the agile practices that the Socialtext development team uses is the retrospective. We produce software in two-week iterations. At the end of each two-week period, the team reviews the iteration – how people feel about it, what worked well, what needs improvement – and we identify items to improve. For example, we observed that the review of stories for the upcoming iteration had a tendency to fall through the cracks. So we tweaked the use of wiki page tags, which serve as a lightweight workflow reminder to identify when a story is in good enough shape for review.

What’s important here is not the specific process we use, or the specific improvement to the process, but the ability of the team to reflect, identify a problem, make a change to address the problem, and assess whether the solution is working. It does help to use lightweight tools that can easily be changed, e.g. defining a tag that can be applied when a story needs review. Unlike the Stack Overflow community, our team does not calculate and display metrics on an individual basis – we’re striving for team goals: to deliver software that meets customers’ needs, when we said we’d do it. So we look at the team data explicitly, and handle individual variance informally. The point is that we have a system that fits the culture, and we can evolve the system to address problems.

So, in response to Chan’s post, it may matter less what sort of feedback system is used – implicit or explicit, numeric or social – and it matters more that the community itself is able to change the rules.

Neal Stephenson on the decline of genre

Thanks to @dbschlosser’s link on Twitter last week, I listened to Neal Stephenson’s lecture on the decline of genre. As a writer of what can variously be called speculative fiction and good books, his primary interest is in the fate of the genre and community he’s been associated with. Though I agree with the thesis that genre as we know it is in decline, I have some different perspectives on the nature and causes of the change.

Stephenson sees speculative fiction as fundamentally about intelligence – books and movies about smart people, and about exploring the impact of ideas. The genre has become increasingly mainstream, since it is increasingly cool to be a geek, an intelligent person with an informed passion. Stephenson makes it a point to describe intelligence outside the framework of social class, of going to a brand-name school, of the signs of high culture; an informed passion for machine shop metalwork is also geekery, and good.

But there are some key aspects of the transformation of genre that Stephenson doesn’t address. The primary transformation over the last 10-15 years is in distribution, in marketing, and in the creation of publics and communities to engage with art. The existence of a broad “mainstream” and identified “genres” was related to older techniques of marketing and distribution. Powerful, expensive mass marketing was used to promote the biggest hits. More targeted marketing was used to reach narrower but still broad demographic categories of buyers. And, of course, physical distribution in bookstores meant that books needed to be shelved in one place, grouped with other books that people in the audience category would be likely to buy.

Given the limitations of mass media and its more targeted niches, the categories of audience were broadly demographic. Mainstream movies have tended to be segmented by gender: there are “chick flicks” about relationships targeted at women and “action films” about violence targeted at men. Music in the US was segmented in an invidious fashion by race, with white and black radio stations, and the categorization of similar musicians onto different genre shelves based on melanin. The emergence of internet distribution and the “long tail” means that the formerly cartoon-broad marketing categories are no longer applicable, and the allocation of physical shelf space is no longer relevant. People are free to describe content outside of marketing categories, and to organize themselves in groups that may or may not bear a resemblance to the groupings created by marketing departments.

Around culture in general, and speculative genres especially, fans have an easier time finding each other, creating large and active communities around Harry Potter, the Lost TV series, and much more. There have always been associations of fans; the internet makes it much easier for like-minded fans to find each other and bond around fictional worlds and other art.

In his talk, Stephenson takes a few swipes at the “postmodern” schools of cultural theory, where critics call into question the ability of artists to control their material; related disciplines examined that lack of control through the lenses of gender and politics. Now, postmodern critics can swim slowly in a small barrel. I went to college at one of the hotbeds of postmodernism. It was rather common for grad student teaching assistants and undergrads flaunting scarves and cigarettes to claim that the text deconstructs itself, and thereby strongly imply that they were smarter than Shakespeare. This was annoying. I avoided really engaging with the ideas until my senior year, and then after I graduated. The extreme views of the junior disciples notwithstanding, the postmodernists and their economic and political cousins had some valid points.

Stephenson talks about the disappearance of the Western as a genre; the simplest explanation is the decline of social confidence in the “cowboys and Indians” narrative. Fewer people were sympathetic to stories about heroic Europeans fighting Native Americans. Cultural criticism would identify that pattern. Stephenson makes a really insightful point about crime fiction getting absorbed into television because of the good fit of detective stories to episodic structure. He makes a much less compelling point, I think, about romance being absorbed into everything – there’s still a big divide between chick flicks and action flicks – though I can’t speak about this in great detail, because those are the mainstream Hollywood movies I don’t go to, in part for lack of identification with either broad gender stereotype. So another argument for the broad appeal of sci-fi themes is that they operated outside the narrow confines of Hollywood gender stereotypes.

Stephenson makes fun of the postmodernists, saying that it’s ridiculous to think that, say, Heinlein was not in full control of his material. But Heinlein is notorious as an old-fashioned pre-feminist whose female characters and gender relationships reflected stereotypes. The classic writers of science fiction – who, in a world of the Cold War, colonialism, and sexism, wrote about colonists exploring other planets and experiencing tensions with the beings they found, about world-threatening conflicts, about male heroes with buxom heroines – were tightly bound to social structures they could not clearly see. The postmodernists had some valid points.

So, I think that changes in technology, economics, and social structure have at least as much to do with the decline of genres, as they were constructed 50 years ago, as the cultural shifts Stephenson describes.

The value of interface design patterns

Amy Hoy writes a provocative blog post, “screw interface patterns”, arguing that following interface patterns leads to boring designs. The thing is, originality is a primary goal only when you are making art. And even artists – especially artists – rely on some combination of established structure and variation; the genius is in choosing what to keep familiar and what to vary.

When you are building software that people use, conventions are especially important. People need to recognize the objects and functions, otherwise they will get confused, frustrated, and go away. If a design element isn’t exactly what users are familiar with, it at least needs to be learnable. A completely original signup pattern might be creative, but it’s probably not good design if your users can’t get in to use your software.

Also, in some cases, the existing established patterns are actually unhealthy – for example, in public web applications, the patterns of using a different username and password per site, and of forcing users to enter usernames and passwords for other sites when using integrated services. In this case, OpenID and OAuth are good new patterns to adopt. A design pattern writeup can argue for a new solution in place of an old one.

Unless your goal is to clone another existing piece of software, you’re probably trying to add some kind of new behavior, something different from what exists already. At Socialtext, we do this all the time when we seek to adapt social software patterns developed on the public internet for use in companies and organizations. When you do this, you constantly need to decide which conventions to keep, based on the models you’re using, and what new designs to add. For example, when creating a social network for use in organizations, explicit “friending” doesn’t make much sense – what does it mean to “friend” your boss or colleague? We left it out, and implemented asymmetrical “following” instead. A key part of the value of design patterns is to help you think, with some richness and nuance, about the items you want to keep conventional vs. the items you want to vary.
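
Here is a minimal sketch of that design difference, with illustrative names that are not Socialtext’s actual code: symmetric “friending” requires mutual consent, while asymmetric “following” is a one-way edge.

```python
# Sketch: symmetric friending vs. asymmetric following.
# Illustrative only; not Socialtext's implementation.
from collections import defaultdict

class SocialGraph:
    def __init__(self):
        self.following = defaultdict(set)  # one-way edges: a -> follows -> b

    def follow(self, follower, followed):
        """Asymmetric: no consent or reciprocity required."""
        self.following[follower].add(followed)

    def is_friend_pair(self, a, b):
        """Symmetric 'friendship' would be mutual following."""
        return b in self.following[a] and a in self.following[b]

g = SocialGraph()
g.follow("employee", "boss")  # fine: no awkward 'friend' request to accept
print(g.is_friend_pair("employee", "boss"))  # False: it's one-way
```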

The Amy Hoy article criticizes design patterns as if they were meant to be used like a textbook to cram for a test – memorize the answers in the book and repeat them. But that’s not the goal of patterns, and I’d argue that anyone who uses them that way is doing it wrong. One of my favorite quotes: “It’s not a religion, it’s just a technique.” Design patterns are tools to help designers make their own decisions, not an AI module intended to replace decisions.

I am a big fan of the O’Reilly Designing Social Interfaces project (disclosure: I was a reader of the manuscript, but I was a fan first). Not because I agree with everything in the book, but because I use the material to help assess which conventions to use and which to vary.

Test-first development for voting system certification?

At the OSCON session on Hacking Open Government, Secretary of State Debra Bowen talked about the mismatch between the process of certifying voting systems, the changing nature of voting requirements, and the goal of open source voting software.

Currently, voting systems need to be certified in order to be used in elections. The certification process entails submitting code to a testing agency that keeps the code, tests, and results proprietary. The Secretary of State’s office has access to the data. Citizens don’t. The testing process is long and cumbersome. This imposes a significant barrier to new entrants, including open source voting systems. When new requirements are added, the system needs to be re-certified. This imposes a long delay on the adoption of modifications.

This testing process is based on a model that is older than current best practices for software design. It follows a “waterfall” method, where software is developed, and testing is done, all in one piece after the fact.

Current best practices are different in a number of ways.
* Software is developed incrementally, and testing is done continuously, as the software is built.
* Tests are written before the software is developed. Tests serve as the detailed specification for the way the software is intended to function.
* Tests are written incrementally. New tests are added to govern new behavior.
* There are automated test suites that verify that the system continues to pass tests, with old and new behavior (a minimal sketch of this style follows below).
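
As a hedged illustration of this style (the vote-counting function, ballot format, and rules below are all hypothetical), a public automated test written before the code can double as its specification:

```python
# Sketch: tests written first, serving as a public, executable specification.
# The ballot format and tally rules are hypothetical illustrations.
import unittest

def tally(ballots):
    """Count one vote per ballot; skip ballots with no selection."""
    counts = {}
    for ballot in ballots:
        choice = ballot.get("choice")
        if choice is None:
            continue  # undervote: no selection recorded
        counts[choice] = counts.get(choice, 0) + 1
    return counts

class TallyTests(unittest.TestCase):
    """These tests double as a public specification of tally behavior."""

    def test_counts_each_ballot_once(self):
        ballots = [{"choice": "A"}, {"choice": "B"}, {"choice": "A"}]
        self.assertEqual(tally(ballots), {"A": 2, "B": 1})

    def test_undervotes_are_not_counted(self):
        ballots = [{"choice": "A"}, {"choice": None}]
        self.assertEqual(tally(ballots), {"A": 1})

if __name__ == "__main__":
    unittest.main()
```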

This suggests a different process for voting system certification.
* Tests are made publicly available. Detailed tests serve as specifications for the behavior of the voting system.
* There is an automated test suite that continually tests the behavior of voting software.
* New functionality can be added to systems and tests incrementally. Tests will verify that the system continues to function correctly, for old behavior and new.
* Results of tests are publicly available.

Using an incremental, test-driven process for voting system development and certification would improve the reliability of the process, by enabling more scrutiny. It would shorten the time needed to introduce new voting system improvements. And it would lower the barrier to new entrants, including open source systems.

This testing would cover only the functional behavior of the system – are votes counted correctly, does the administrative process work. There is still a need for security and penetration testing, which goes beyond the function of the code and includes all aspects of the system: physical security, authentication practices, data integrity, and more. And there is still a need for usability testing, which as far as I know is not yet part of voting system certification. Usability problems account for a larger portion of day-to-day voting system failures than technical failures do, although technical failures can have disastrous results.

Still, opening up the functional testing process, and running it incrementally, seems as though it might offer significant benefits.

For practitioners of modern software development and testing – what do you think about this suggestion? Are there any big gaping holes that would make it nonsensical or infeasible? Feedback most welcome.