Buzzword Watch 2017

Ah, the buzzword watch. What phrases and ideas will seep into our work lives in the coming year, corrupting our vocabulary and making jargon-haters cringe? All my usual caveats apply. Being a buzzword doesn't make something good or bad, nor does it mark an idea as fleeting or lasting; they are just buzzwords (or are they?).

Ecosystem
Foundations and nonprofits used to talk about issue areas, sectors, domains, or fields of interest. Now they talk about ecosystems. The intent is to capture the interdependent network of enterprises, laws, infrastructure, people, and tools that influence and shape each other. Think of the “mobile ecosystem” as the complete set of apps, devices, telecommunications infrastructure companies, software code, and legal requirements that determine how our cellular phones work.

Ransomware
Software that encrypts all the files on a computer system, allowing the "data kidnapper" to hold them hostage until a ransom is paid. Ransomware attacks became almost commonplace in 2016, and many of the victims were not-for-profit hospital systems and community clinics.

Refugee tech
The number of refugees worldwide is at an all-time high. People fleeing war, repression, and the effects of global warming number in the tens of millions. Many of them are digitally dependent, with their mobile phones serving as metaphorical and literal lifelines. Refugee tech includes two broad categories of innovation: digital tools that won’t make people any more vulnerable to racism, xenophobia, or government oppression; and digital tools for wayfinding, job creation, skill building, and other key necessities for building a new life in a new place.

Overtime
The United States government changed its rules on overtime work in 2016. Nonprofit organizations—operating on lean budgets and often at the mercy of government contracting rules—once again found themselves squeezed between values (a decent living) and reality (no funds). Labor markets, shaped by demographic shifts, regulatory changes, on-demand work, and automated contracting, are changing rapidly. Civil society continues to be buffeted from all sides by fundamental changes in the ways we work.

Humans-in-the-loop
The rhetorical battle between autonomous machines and autonomous people meets at the point where system designers discuss putting "humans (or society)-in-the-loop." It's technical slang for requiring that at some point in a computational process—such as in self-driving cars, predictive algorithms, or even mobile phone–based mapping programs—a person (or society) takes charge. Some systems have many such points. And as computational processes and humans interact ever more frequently in ever more "real world" ways, the need to build societal norms into the loop—group values and group defaults—becomes ever greater. Think of it this way: We've put plenty of computational processes ("the loop") into our daily lives, and this buzzword reminds us it's well-nigh time to start designing our daily lives back into those processes.
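The pattern is easy to see in miniature. Here is a hedged sketch (not from the original text; all names and the confidence threshold are illustrative) of one common form of human-in-the-loop design: an automated decision stands only when the system is confident, and borderline cases are routed to a person.

```python
# A minimal sketch of a "human-in-the-loop" checkpoint. The model,
# threshold, and labels are illustrative stand-ins, not a real system.

def automated_decision(score):
    """Pretend model output: a label plus a confidence in [0, 1]."""
    label = "approve" if score >= 0.5 else "deny"
    confidence = abs(score - 0.5) * 2  # farther from 0.5 = more confident
    return label, confidence

def human_review(score):
    """Stand-in for the point where a person takes charge."""
    return "needs_human_review"

def decide(score, threshold=0.6):
    label, confidence = automated_decision(score)
    if confidence < threshold:   # this is the "loop" point
        return human_review(score)
    return label

print(decide(0.95))  # approve (high confidence, stays automated)
print(decide(0.05))  # deny (high confidence, stays automated)
print(decide(0.55))  # needs_human_review (ambiguous, goes to a person)
```

A system with "many such points" would simply place checkpoints like `decide` at several stages of its pipeline rather than only at the final decision.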

Hybrid donors
Giving to charity isn't the only way to use your private financial resources to influence public change. Increasingly, individuals are "giving to politics," especially in the U.S. As far as I can tell, David Callahan at Inside Philanthropy coined the term "hybrid donors" to name those who deliberately, and with some strategy in mind, give to both. Sometimes this hybridization is institutionalized, such as with the creation of large-scale LLCs that can make political contributions, charitable gifts, and impact investments. Sometimes it describes individual donors, who use the rules of both political giving and charitable giving to achieve their visions of public good.

Data trusts
How do you give away digital data? Data trusts are one solution: a new form of organization that focuses on governing the agreements between data providers and data users. There are examples built around aggregated public data being managed for use by program evaluators (the Justice Data Lab at New Philanthropy Capital). ArtStor and JSTOR are examples of the digital management of copyrighted works. LearnSphere, a repository of student data and research methods at Carnegie Mellon, is an example focused on serving researchers. Look for both more examples and more refining of the rules in the year(s) to come.

Privacy inequality
More and more software in more and more aspects of our lives means ever more ways to collect data on our individual behavior, store it somewhere, and seek ways to monetize it. As long as business models depend on the monetization of large quantities of individual data, there will be an incentive for software to default to invading privacy. One option, for those with money, is to pay for a software version that spies less. This is already common: just think of every app that offers a free version with advertisements or a paid version without. The more pervasive this kind of software is, the more our "privacy inequality" will come to mirror income inequality.

Crowdfunding
In 2015, the online crowdfunding platform GoFundMe moved more than one billion U.S. dollars. Most of the money moves directly from donors to recipients, with little involvement by nonprofits in between. One large-scale example of this came after the mass shooting in an Orlando, Florida nightclub in June of 2016. Within weeks of the horrible event, more than $7 million had been raised and routed directly to families and individuals. Nonprofit organizations that have typically managed such campaigns or provided intermediary relief services need to better understand this phenomenon, its motivations, and its implications. The rest of us need to think about what types of transparency and fraud abatement measures we should expect of these crowdfunding platforms.

Biased data
As they "train" software to learn patterns of behavior, scientists and designers often turn to existing data sets as raw material. If the existing data set is biased, the computer will learn—and likely reinforce at ever greater speed and scale—the original biases. Examples have been found in social media facial recognition software that doesn't register black skin, hiring sites that discriminate against female-sounding names, and criminal justice algorithms trained on racially biased historical data about incarcerated populations. We don't need software to help us discriminate faster and more broadly, yet that seems to be a lot of what we're getting. Countering this trend requires a diversity of data and system designers and a refusal to accept "black box" decision making.
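The mechanism can be shown with a toy example. This sketch (entirely illustrative; the data, groups, and "model" are invented for this example) trains a trivial classifier on skewed historical decisions and shows that it simply turns the past bias into a rule:

```python
# A toy illustration of bias-in, bias-out: a trivial "classifier" that
# learns the majority outcome per group from past decisions. The data
# and group labels below are invented for illustration.
from collections import Counter, defaultdict

historical_decisions = [
    # (group, hired?) — skewed history: group "B" was rarely hired
    ("A", True), ("A", True), ("A", False),
    ("B", False), ("B", False), ("B", True),
]

def train(data):
    by_group = defaultdict(list)
    for group, outcome in data:
        by_group[group].append(outcome)
    # "Learn" the majority outcome for each group
    return {g: Counter(v).most_common(1)[0][0] for g, v in by_group.items()}

model = train(historical_decisions)
print(model)  # {'A': True, 'B': False} — the past bias is now the rule
```

Real machine-learning systems are far more complex, but the failure mode is the same: without auditing the training data and the model's outputs, the system encodes yesterday's discrimination as tomorrow's default.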


This takeaway was derived from Philanthropy and the Social Economy: Blueprint 2017.