Buzzword Watch

Ever since Network for Good launched back in 2001, the modifier “. . .for good” has become ubiquitous. There are computer scientists for good, search engines for good, magazines for good—the only thing I haven’t seen is “Evildoers for Good.” As a buzzword, the phrase “. . .for good” has so pervaded our vocabulary as to become genre defining, like the role of love in pop music or car crashes in action films. At the risk of buzzword overdosing, I think it’s fair to say that “. . .for good” is the uberbuzzword of the social economy. I’ve clustered the following list into “buzz-topics” that align with the broader themes of this Blueprint.

PHILANTHROPY AND THE SOCIAL ECONOMY
While all of the buzzwords on this list matter, this group comes directly from within the sector. Some of the terms draw from longstanding debates about metrics and efficacy; others draw from emergent ideas about how best to dedicate private resources for public benefit. 

Overhead Myth
The overhead myth is the name given to the oversimplified use of administrative cost ratios as a meaningful indicator of organizational effectiveness. Charity Navigator and other websites aimed at informing donors perpetuate this measure, even as they often include small-print qualifiers about its limited value. In the last few years a coordinated response to debunk the attention given to administrative costs has gained significant traction, leading to a bit of a rhetorical/behavioral standoff: nonprofits, foundations, donors, and charity ranking sites all discourage attention to overhead cost ratios even as they continue to report them. Nonprofit organizational costs are like rubbernecking; we know we shouldn’t look, but we just can’t help ourselves.

Effective Altruism
Not to be confused with effective philanthropy, the effective altruism movement has its roots in utilitarian philosophy and a modern-day spokesperson in Princeton professor Peter Singer. If the movement needed a bumper sticker it would be, “Do the most good,” the idea being that we should seek rational calculations for the greatest returns on our charitable gifts and actions. Oxford University’s Centre for Effective Altruism has helped spread the ideas among university students. Proponents and detractors abound. Like it or lump it, effective altruism offers intellectual shape and a set of principles to the long-brewing but inchoate attention to metrics, data, and outcomes. See also X-Risks (below) and the idea of “unicorns”—Silicon Valley–speak for companies that reach valuations over $1 billion while still privately held, making them a big win for investors. The old term for unicorn in the philanthropic sense might be “silver bullet.”

X-Risks
Shorthand for “existential risks,” these are the biggies—the things that could wipe out humanity. A report from the Global Challenges Foundation listed 12 terrifying possibilities ranging from artificial intelligence to catastrophic climate change to pandemics to synthetic biology. Each of these forces could wipe out current human populations and preclude any future generations—ending the species known as people. The likelihood of catastrophic climate change is great enough that cost-benefit calculations argue for taking steps now to prevent it.

Platform Cooperativism
What if TaskRabbit were owned by the rabbits? Or drivers owned Uber? Drawing from centuries of common practices, many cultural definitions of shared property, the cooperative movement, and the split reality of platform-enabled work, a new interest in platform cooperativism is emerging. The goal is to create tech platforms owned by those who build and use them. It’s one more sign that the future of work is . . . in flux. When the cooperative enterprise structure meets high tech (see Loomio, Ethereum, and the Enspiral Network) it’s a good sign that new governance models may be on the horizon for the social economy.

SCIENCE, EVIDENCE, AND INTEGRITY
Some big foundations and proponents of effective altruism are demonstrating real interest in evidence and science. As they do so, several of the “terms of art” that keep science and research moving ahead are gaining traction in the philanthropic sphere. Nonprofits and foundations have been talking about and sometimes honestly trying to deal with failure in more productive ways than just looking the other way. Science is, of course, built around failure—the scientific method relies on generating a hypothesis, running experiments, learning from failure, generating new hypotheses, running more experiments, and so on. Confirming scientific findings relies on others being able to replicate your work, which in turn requires scientists to share methods and data.

Worm Wars
We’re all familiar with philanthropy’s growing interest in randomized control trials and evidence-based social practice. (See Blueprint 2014 buzzword, randomista.) But what if the scientists don’t agree? This is what happened when research studies that seemed to show the effectiveness of deworming medication on young people’s educational and health indicators were replicated and. . . the results varied. The resulting battles over the science were dubbed the worm wars. The alliterative name helped attract media attention. The more philanthropy seeks to rely on evidence, the more it’s going to find itself caught on methodological battlefields. Just ask any climate scientist; these are real battles being fought. (Replication would be the less alliterative alternative for this buzzword nominee.)

Retraction
Retraction is what happens when a scientist’s purported findings cannot be replicated or substantiated by anyone else. The highest-profile recent retraction case involved a psychology study that purported to document attitudinal changes about gay marriage if the political canvasser asking the questions identified as gay. The journal Science retracted the study, and the philanthropically supported Center for Scientific Integrity drew a lot of attention for its Retraction Watch website. Might we see foundation-funded research (outside of the academy) begin to bear scrutiny at this level?

Publication Bias
What if scientists only announced successful experiments? Their peers would be left to stumble through all the mistakes that had been made in the past, and learning and experimentation would be slower for everyone. Yet, this is what has happened over the years as journals and professional advancement subtly shift attention toward “publishing what works” and not what failed. The phenomenon was first brought to public attention around drug trial studies, but the practice and the concern extend far beyond just pharmaceutical research. Alternatives include funding journals of failure and a philanthropically supported effort to make data from all experiments open for review.

INFOTECH AND DIGITAL
Every year’s buzzword list needs to consider the latest tech jargon. Here are three terms that we’ll be hearing frequently in the coming year, each representing a technological advance that brings both promise and peril.

Algorithm
We’ve learned to think about data; now we’re realizing we also need to think about the algorithms by which we analyze or manipulate that data. Who’s creating them, and how do they amplify existing biases? What, if any, recourse do we have if algorithms discriminate? The truth is that all the data and analysis we’re now capable of aren’t making things simpler or more straightforward. Instead, they’re demanding a new kind of data literacy, giving rise to new sorts of “data intermediaries,” and requiring new forms of oversight and interpretation.

Augmented Reality
The Oculus Rift and other virtual reality headsets get a lot of attention, but these are still a generation away from adoption by anyone who doesn’t want to walk around wearing what look like black-tinted ski goggles. But augmented reality—in which digitized information appears in view alongside the real world—is already here. Cars with directions projected from the GPS onto the windshield are one example. We already spend hours every day staring at our phones; soon we’ll be pointing them at everyday objects (and other people) and getting all sorts of information about whatever is in view.

Thing Hacking
Fifty billion connected devices equals the Internet of Things (see Blueprint 2015 buzzword). Devices packed with as much software as your desktop or phone can be hacked, just like your desktop or phone. In July 2015, hackers disabled a car traveling at 100 kph on a public highway. The good news is that we know cars can be deadly, so regulators and manufacturers are moving faster than they did before to address these security issues. The bad news? Now you’ll have to ignore the Terms of Service on your toothbrush, just as you’ve always done on your phone apps.

BIOLOGICAL TECHNOLOGIES
I’ve been watching robotics and biotech as proximal areas of change for the social economy. The Insights section discussion of the future of work captures social sector implications of work in robotics, artificial intelligence, and deep learning labs. As biotechnological advances move out of the lab and into our lives, ideas and innovation in these fields will begin to creep into our work and our jargon.

Biononymity
It’s not just cameras, building ID card scanners, and license plate readers that are tracking our every move. As DNA analysis gets better and cheaper, our lack of “biological anonymity” is coming to the forefront. Artists use “found” DNA from stray hairs on subway cars and lipstick taken from tossed-out coffee cups to create remarkably accurate drawings and three-dimensional representations of commuters who have passed by. Lawyers, artists, biologists, and technologists are coming together in an informal network known as biononymous.me to proactively consider the implications of this creepy new reality.

CRISPR
What if someone could cut and paste genetic material with the equivalent ease of word processing? A new system for genomic editing—specifically cutting and pasting “clustered, regularly interspaced, short palindromic repeats” (CRISPR)—now exists. The technology is the subject of both major scientific and corporate battles, but its influence comes from its low cost and widespread availability. While we’ve been focused on digital hacking, gene hacking is about to become a real possibility. It’s entirely likely that biological systems are about to follow a similar trajectory of de-institutionalization, “freelance science,” and hard-to-regulate spaces that have marked the last decades of digitization.


This takeaway was derived from Philanthropy and the Social Economy: Blueprint 2016.