Systematic Review of Recent Social Indicator Efforts in US Coastal and Ocean Ecosystems (2000–2016)

in Environment and Society

ABSTRACT

Since Hurricanes Katrina and Sandy, governmental organizations have placed the development of metrics to quantify social impacts, resilience, and community adaptation at the center of their agendas. Following the premise that social indicators provide valuable information to help decision makers address complex interactions between people and the environment, several interagency groups in the United States have undertaken the task of embedding social metrics into policy and management. While this task has illuminated important opportunities for consolidating the social and behavioral disciplines at the core of the federal government, significant risks and challenges remain as quantification approaches move forward. In this article, we discuss the major rationale underpinning these efforts, as well as the limitations and conflicts encountered in transitioning research to policy and application. We draw from a comprehensive literature review to explore major institutional initiatives addressing community well-being, vulnerability, and resilience in coastal and ocean resource management agencies.

The demand for social indicators has increased dramatically over the past 10 years. Hurricanes Katrina, Ike, and Sandy and the Deepwater Horizon oil spill have exposed important gaps in how institutions prepare for and respond to extreme events. The premise that social and economic indicators provide valuable information to help decision makers address the complexity characterizing socioecological systems, and most critically the threats of anthropogenic environmental change, has led several interagency groups to undertake the task of embedding social, behavioral, and economic tools into policy and governance. As a result, the US Department of the Interior (DOI), the US Environmental Protection Agency (EPA), the Federal Emergency Management Agency (FEMA), the National Institute of Standards and Technology (NIST), and the National Oceanic and Atmospheric Administration (NOAA) have all prioritized the development of metrics of community vulnerability, resilience, and well-being (Biedenweg et al. 2014; Hicks et al. 2016; McBain and Alsamawi 2014).

While the challenge of measuring and predicting social conditions in the context of climate change is unprecedented, the task of developing social indicators is not a new one. Over the past 50 years alone, two independent but successive programs have proposed social indicators as a means of addressing important issues in economic development, social welfare, and environmental sustainability. Initially, both indicator programs enjoyed considerable support, inaugurating prolific areas of research and academic inquiry in their respective fields. Few of these efforts, however, consolidated within institutional settings, and within the span of two decades each project abruptly declined (Badham 2009; Brown and Corbett 1997; Innes and Booher 2000; MacDowall et al. 2016).

The intent and scope of the third, more recent social indicator movement have been innovative. It has steered toward a more comprehensive treatment of community vulnerabilities and adaptability by relying on a suite of spatial, behavioral, and social science techniques that were previously unavailable or largely unknown. But this new effort is, much like its predecessors, part of a larger tradition that connects states with the quantification of the social dimension and the formulation of policy, processes, and institutions. Independent of partisan affiliation, the indicator movement is driven by a broader policy agenda that invites academics and scientists to revisit social measurements and the provision of social intelligence to develop management tools (Wong 2003). In its scientific dimension, the program not only strives to answer immediate policy needs originating across different local, state, and federal agencies but also responds to changing political priorities that are formulated in the context of an administration. In that respect, the indicator movement designs and implements processes of centralization and decentralization of information; it legitimizes and creates institutions in the form of working groups and regulatory frameworks (Wong 2003). Because it is tied to policy cycles at the government and international levels that lack continuity, the indicator movement suffers from intrinsic limitations that undermine its success (Innes 1989).

As quantification approaches continue to move forward, proponents of social indicators recognize the risks and challenges in reconciling research and theory with policy. Calls have been made for more transparent practices in how theories, frameworks, and methods are selected among new indicator tool sets, or even data sets. However, the question remains as to what extent the current emphasis on indicators can capitalize on prior knowledge to avoid common missteps in setting expectations and goals and in recognizing its role within a policy agenda.

In this article, we adopt Jo Ellen Force and Gary Machlis’s definition of social indicators as “an integrated set of social, economic, and ecological measures collected over time and primarily derived from available data sources, grounded in theory and decision making” (1997: 371). We use this definition because it captures what we consider one of the main shortcomings of previous indicator efforts: the lack of an explicit theoretical framework that is institutionally embedded, from its inception, within a policy setting. Our main motivation is to explore the current government initiatives addressing the development of social indicators in the context of ocean and coastal resource use and management. The article discusses the major rationale underpinning these efforts, as well as the limitations and challenges encountered in operationalizing notions such as resilience and adaptability in policy. Relying on a review of gray literature and white papers originating in government settings over the past 15 years, it examines the use of indicator frameworks and the appropriateness of metrics according to stated policy and social needs. The article is structured as follows: the first two sections explore the historical and institutional context behind social indicator development in the United States, along with major indicator frameworks. The next section characterizes the methodology and sources used for the analysis. We then present the main findings from the content and multidimensional scaling analyses. Finally, the last section discusses these results in light of current environmental and management challenges and in the context of evidence-based governance.

The Demand for Social Indicators

Research on the socioeconomic and cultural conditions of societies has been a matter of interest to governments for well over one hundred years (MacDowall et al. 2016). With the consolidation and systematization of state bureaucracies during the 1700s and 1800s, demographic statistics and actuarial science began to address taxation and insurance needs by studying the physiognomy and morphology of certain sectors of the population. The professionalization of social disciplines and methods and the generalized climate of conflict and social discontent prevalent in the late eighteenth and nineteenth centuries also contributed to the consolidation of a numerical approach to societies (Cobb and Rixford 1998; Hacking 1990; Innes 1975). The quantification of social phenomena was perceived as a way to address government-level objectives related to fiscal accountability, social engineering, and planned development (Foucault 2007; Scott 1998). By the beginning of the twentieth century, most countries in Europe and North America had established statistical bureaus and carried out their first comprehensive population censuses. In the early decades of the 1900s, the relationship between social policies and statistics continued to evolve. The first set of community indicators was developed by the Russell Sage Foundation in 1910, and the Recent Social Trends in the United States report was published in 1933 by US President Herbert Hoover’s Committee on Social Trends (Atkinson et al. 2002; Brown and Corbett 1997; Noll 2004).

Despite this larger tradition in social metrics, the term “social indicator” was not formally introduced until 1966 (Innes 1989; White 1983). With support from the National Aeronautics and Space Administration (NASA), Raymond Bauer published a compendium of social indicator studies that summarized much of the work being conducted at the time (Land and Spilerman 1975; Maloney 1968). Through this volume, Bauer made a concerted effort to formalize a research program for social indicator work in the United States that could explore the social ramifications of technological progress (Ferriss 1979). The field included researchers such as Otis D. Duncan, Kenneth C. Land, and Eleanor B. Sheldon (Innes 1989; White 1983), and it rapidly captured the attention of other academics and policy makers (McBain and Alsamawi 2014). Within the program, advocates of social indicators saw metrics as a way forward in addressing societal problems that had gone unattended by economic statistics (Andrews 1989; Andrews and Withey 2012; Innes and Booher 2000). Along with anticipating the changes brought by technological development, there was an interest in understanding issues of social cohesion and welfare amid the prosperity of the postwar era (Cohen 1969). The new type of social information produced by indicators would serve as a “yardstick” to assess progress and ultimately improve quality of life through the design of more effective policies and goals (Bauer 1966; Innes 1989).

Because of its simplicity and the strong sense of social commitment it elicited, the idea of adopting indicators spread to both national and international scenarios (Noll 2004). Multilateral organizations such as the United Nations (UN) and the Organisation for Economic Co-operation and Development (OECD) launched comprehensive social indicators and demographic programs that became the standards for further social statistics work. Within the United States, the desire to institutionalize a new system of social bookkeeping led to the introduction of the Full Opportunity and Social Accounting Act of 1967 by Senator Walter Mondale (Sheldon and Freeman 1970). President Lyndon Johnson’s request for a new social indicator program that could supplement the information provided by the Bureau of Labor Statistics and the Council of Economic Advisers was met with the release of three reports examining social conditions of the nation throughout the 1970s (Cohen 1969). Several other undertakings were made at the institutional level, such as the appointment of a permanent center by the Social Science Research Council to conduct studies on indicators. In the academic sector, new lines of research were inaugurated or rejuvenated as unprecedented funding support was made available. A new journal, Social Indicators Research, was created in early 1974, and important methodological volumes on indicator development were published (Atkinson et al. 2002).

Despite this high level of activity, the political consolidation of the program in the United States would encounter important challenges during Richard Nixon’s and Ronald Reagan’s presidencies. The tightening of public expenditure and the redirection of government toward a market-based approach in the early to mid-1980s placed social policy under the purview of transnational organizations (Wong 2003). This structural shift in responsibilities eroded support for the social indicator agenda, sending the movement into steep decline. The reduction in funding and government interest, which enabled the decentralization of social intelligence, was attributed to several causes (Andrews 1989; Innes 1989; Innes and Booher 2000).

Among them was the realization that many of the goals set by the movement were far from being reached (Land et al. 2011). Perceptions that social indicators could revolutionize policy and “lead to better social systems based on knowledge about the strength and weaknesses of current social programs” (Andrews 1989; Andrews and Withey 2012: 3) quickly collided with a stalling economy and tense Cold War politics. The lack of results on the social and political front reflected the unrealistic expectations that proponents of the movement held about the role of information in enabling change (Cobb and Rixford 1998). The optimism that characterized the previous two decades of indicator work stemmed from a strong focus on empiricism and descriptive data (Sawicki 2002). But this emphasis on induction and the generation of information did not make indicators technically neutral measures, free of theoretical and political orientation (Gergen 1973; Green 2001). Many different kinds of assumptions about what a society should be like, or what constitutes normality, were embedded in the choice of indicators, methodological tools, and dimensions of measurement that guided the scope of the movement’s work (Badham 2009; Force and Machlis 1997; MacDowall et al. 2016; Noll 2002). Furthermore, assumptions about the role of science in informing policy and social reform were “simplistic” and often naïve, lacking clear policy targets or objectives and providing no guidelines for translating findings into actions (Innes 1989; Wong 2003). In all, the emphasis on developing new methodologies of measurement, the absence of explicit theoretical frameworks, and the tenuous connection to decision makers ultimately produced large amounts of data but no new policies (Brown and Corbett 1997). With large collections of statistical facts comprising thousands of pages, but few attempts at eliciting causality or explanations that might increase their usability, the capacity of indicators to answer the needs of social accounting and policy evaluation systems came into question (Innes and Booher 2000).

In the 1990s, indicator approaches reemerged in the context of environmental conservation (Wong 2003). During the previous decades, key pieces of legislation such as the National Environmental Policy Act (1969), the Magnuson-Stevens Act (1976), the Water Quality Act (1965), and the Clean Air Act (1963) had been passed as environmental degradation and pollution entered the public eye. These policies instituted specific requirements for the assessment of societal and environmental impacts resulting from management actions. By the 1980s and 1990s, the NOAA, the DOI, and the EPA had begun to develop and incorporate Social Impact Assessment (SIA) into their programs. However, the practice was not widespread, lacking systematicity and continuity (Pollnac et al. 2006). With numerous lawsuits won against agencies because of insufficient assessment protocols, declining stocks of critical fishery resources, and the extinction of key charismatic species, agencies began revising their own approaches to accountability in the mid-1990s (Smith et al. 2011). The Rio Earth Summit of 1992 was another influential force in the resurgence of metrics, with an explicit call for indicators to address sustainable development in the context of environmental resources (Wilson et al. 2007).

The second wave of indicators became known as the “community indicators movement” (Sawicki 2002), for it included a new interest in the participation of social groups and the citizenry at large (see President George W. Bush’s Executive Order 13352 of 2004). Whereas earlier approaches were directed at the investigation of trends and social conditions in the context of welfare and life quality, the second cycle of indicator development was inspired by concrete decision-making needs, performance evaluation, and the pursuit of sustainability within broader community issues. The difference in approaches between these two cycles, however, was only partial, as both programs shared common goals and limitations (Martínez and Dopheide 2014; Wong 2006). In terms of its shortcomings, the new movement failed to build from previous experiences, yielding hundreds of different approaches with weak conceptual and methodological bases and insubstantial connections with policy (Wilson et al. 2007; Wong 2003).

With the devastating impacts of Hurricanes Katrina and Rita (2005), Hurricane Ike (2008), and, more recently, the Deepwater Horizon oil spill (2010) and Hurricane Sandy (2012), questions about the vulnerability and resilience of communities drew the attention of President Barack Obama’s administration. Indicators have once again been featured in the conversations of management agencies at the local, regional, and federal levels, as well as among researchers and nongovernmental organizations dealing with coastal and marine issues. The US government has made a commitment to incorporating the social, economic, and behavioral sciences to inform decision making and to develop actionable information (Executive Order of 15 September 2015), with a stream of financial and political support directed to understanding what makes coastal communities more resilient to future climate change effects, to including ecosystem services in policy, and to adopting socioeconomic metrics of well-being, vulnerability, and adaptive capacity in national assessments (see, e.g., the National Climate Action Plan of 2013, the National Ocean Policy Implementation Plan Appendix of 2013, and Memo M-16-01 on Incorporating Ecosystem Services into Federal Decision Making). This has resulted in the creation of several interagency working groups, such as the Interagency Working Group on Ocean Social Science (IWG-OSS) and the NOAA/FEMA/NIST Mitigation Framework Leadership Group (MFLG), to compile and draft community resilience indicators. For example, the IWG-OSS has been tasked with the review of social indicators in coastal, ocean, and Great Lakes management as outlined in the Implementation Plan Appendix of the National Ocean Policy (2013).

In addition, expert workshops on social indicator development and social science integration, such as those sponsored by the US Global Change Research Program and the NOAA’s Coral Reef Monitoring Team, have been convened to provide guidelines and recommendations on domains of measurement (Lovelace and Dillard 2012). The Disaster Relief Appropriations Act of 2013, referred to as the Sandy Supplemental, has provided critical funding through the DOI for 150 projects exploring resilience. It has also led to the creation of the DOI Metrics Expert Group (DMEG), which is tasked with recommending metrics for resilience assessment and with determining needs and gaps (Abt Associates 2015). In the near future, the RESTORE Act is expected to allocate funds to the restoration and enhancement of coastal community resilience and local economies in the Gulf of Mexico. Federal funding opportunities have already invited applications for the “identification of currently available health/condition indicators of Gulf of Mexico ecosystem components, including humans, followed by comparative analysis of strengths and weaknesses and design/testing of additional indicators” (“FFO-2015” 2016).

The demand for social metrics is only expected to consolidate in the years to come. To make progress, it is essential that current efforts recognize the setbacks and limitations of prior indicator movements. As Judith Innes and David Booher suggest in relation to these programs, “the ‘let a hundred indicators blossom’ approach has led to information overload and unfocused development,” hindering the potential of the social sciences to influence policy (quoted in Wong 2003: 261). If trends continue, and a plethora of tools and metrics are created without consolidation or reconciliation with a common set of frameworks, we believe that current efforts will repeat past mistakes (McBain and Alsamawi 2014). In the next sections, this article approaches this issue and offers some consideration of current indicator frameworks, social intelligence, and evidence-based policy making (Davies 2012). It argues that the use of particular indicators predetermines the identification of problems and solutions, and constrains the types of interventions and decisions that local, state, and federal governments can make (Noll 2002). Therefore, greater clarity is called for in identifying the indicator frameworks and theories, as well as the mechanics behind their implementation, that underlie many of these apparently new monitoring systems. The process requires not only reconciling policy with measurement and theory (or indicators with indices, domains, and frameworks), but also a careful assessment of the policy and research contexts that give rise to quantification approaches.

Indicator Frameworks

We define the term “indicator” beyond its common use as a single measured variable that quantifies the state or quality of an attribute in the world (e.g., the number of households on food stamps). It must be noted that indicators are not purely quantitative or qualitative raw data; their value is established in reference to a baseline or a target condition (Hák et al. 2012). This suggests that, without a theoretical framework to interpret it, an indicator by itself is of little use.

When several measured variables are integrated into a single number or score, such as the Gini coefficient or the gross domestic product (GDP), the resulting composite indicator is known as an index (Dillard et al. 2013). In turn, indices and indicators can be combined to represent a single semantic construct or domain (e.g., the domain of food security is composed of three indices, covering access to food, affordability, and quality, each itself an aggregation of individual indicators) (IFPRI 2015). While domains are frequently equated with dimensions in the terminology of measurement theory, dimensions can also represent the aggregation of several domains into a larger entity or concept. For example, the dimension of health is commonly subdivided into the domains of mental, physical, and material health, each of them an aggregate of indices and indicators (MEA 2005).
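
As a minimal sketch of this indicator-index-domain hierarchy, the following Python fragment mirrors the food security example above; all variable names, values, and the equal weighting are hypothetical, chosen only for exposition:

```python
# Hypothetical illustration of the indicator -> index -> domain hierarchy.
# Names, values, and equal weights are invented; a real framework would
# first align indicator polarity (higher = better) and justify its weights.

indicators = {
    "pct_households_on_food_stamps": 0.18,  # each normalized to [0, 1]
    "distance_to_nearest_grocer":    0.42,
    "share_income_spent_on_food":    0.25,
    "fruit_veg_servings_per_day":    0.60,
}

# Indices combine related indicators (here: simple unweighted means).
indices = {
    "access":        ["distance_to_nearest_grocer"],
    "affordability": ["pct_households_on_food_stamps",
                      "share_income_spent_on_food"],
    "quality":       ["fruit_veg_servings_per_day"],
}

def mean(values):
    return sum(values) / len(values)

index_scores = {name: mean([indicators[v] for v in members])
                for name, members in indices.items()}

# The domain of food security aggregates the three indices in turn.
food_security = mean(list(index_scores.values()))
print(index_scores, round(food_security, 3))
```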

We consider an indicator framework to be the implementation of one or more theoretical concepts through a set of dimensions, domains, indices, or variables and their associations (Berger-Schmitt and Noll 2000; Noll 2002). During the process of implementation, also known as operationalization, choices are made regarding the mapping of concepts to empirical observations and the aggregation of different data sources (Hueting and Reijnders 2004). These choices are equivalent to formulating theoretical hypotheses about which events are important to measure and the types of relations that may exist between observations. In short, if carefully constructed, the theory behind an indicator framework explicitly articulates a particular understanding of the world. Most importantly, when motivated by a policy question, a framework can enable decision makers to formulate expectations about the behavior of an institution, a social system, or its agents, and to act on them (Hák et al. 2012).

As discussed in the previous section, one of the most fruitful conceptual frameworks in the field of social indicators stems from the sustainable development movement of the early 1990s (UN-DESA 1992; Wilson et al. 2007). The goal of this approach was to assess the environmental sustainability of economic and social policies to better inform decision makers in all spheres of government and resource management. Through the continuous advocacy of the UN Environment Programme and transnational organizations such as the OECD and the European Commission, the sustainability movement led to the creation of a plethora of different sets of indicators, predominantly at the national level. The most frequently used instruments are the Ecological Footprint, the Human Development Index (HDI), the Ecological Well-Being Index (EWI), the Sustainable Development Index (SDI), the Environmental Vulnerability Index (EVI), and the Environmental Sustainability Index (ESI) (Dahl 2012). Initially, sustainability was defined as the ability to protect resources for future generations by “safeguarding the vital functions” of ecosystems and by fostering equitable economic growth and social progress (Hicks et al. 2016; Hueting and Reijnders 2004: 249). Common metrics included a varied collection of economic, ecological, and social attributes such as social and economic capital, economic growth and performance, life quality measures, human well-being and development, impact assessment, and pollution and biodiversity indices (Hák et al. 2012). As the notion expanded to account for effects of and responses to climate change, sustainability frameworks also evolved to deal with issues of vulnerability, risk, and resilience (UNEP 2015).

Building from C. S. Holling’s core definition of resilience as the capacity of a system to withstand change, the socioecological systems (SES) approach has become one of the dominant analytical theories in the study and assessment of human-environment interactions (Holling 1973; Berkes 2006; Ostrom 2009). In the field of social indicators, SES frameworks rely heavily on the development and application of indices to measure community resilience in terms of adaptability, risks, and vulnerability (S. Carpenter et al. 2001). How the latter is defined has been standardized in the vulnerability formula used by the Intergovernmental Panel on Climate Change (IPCC), which refers to the Vulnerability-Resilience Indicator Model (VRIM) (Yohe et al. 2006). The formula proposes the construction of a vulnerability index that combines the domains of sensitivity, exposure, and adaptive capacity, each itself an aggregation of indices and indicators such as GDP, income distribution, infrastructure, food security, and health. In all, SES vulnerability and resilience metrics emphasize the understanding of the critical conditions that can enhance or impair the ability of a community to respond to extreme events, climatic hazards, and natural or human-made disasters (A. Carpenter 2013; Cutter et al. 2009). The objective is to aid civil and government institutions in mitigating and preparing for future environmental challenges. In addition to risk and disaster management, SES assessments have also approached issues related to the governance and management of natural resources central to sustainability perspectives (Walker et al. 2006). Examples of indicator sets are the Social Vulnerability Index (SoVI), Resilience Index Measurement and Analysis (RIMA), and the Resilience Alliance tools for sustainability assessment in socioecological production landscapes (SEPLS) (Bergamini et al. 2013).
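
Schematically, the VRIM-style index can be written as follows; the weighted additive form is one common operationalization, offered here for illustration only, since implementations differ in aggregation, normalization, and weighting:

```latex
% Schematic VRIM-style vulnerability index for a community c.
V_c = f\big(E_c,\, S_c,\, AC_c\big) \approx w_E E_c + w_S S_c - w_{AC} AC_c
% E_c  : exposure           (e.g., hazard frequency, coastal location)
% S_c  : sensitivity        (e.g., resource dependence, infrastructure at risk)
% AC_c : adaptive capacity  (e.g., GDP, income distribution, health, education)
% Each term is itself an aggregate of normalized indices and indicators;
% adaptive capacity enters negatively because it reduces vulnerability.
```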

While there are important points of convergence between the sustainability and SES frameworks, there are also differences. Scholars have observed that definitions of terms can vary, with sustainability approaches often equating “resilience” with “vulnerability” (Cutter et al. 2014). Additionally, SES indicator frameworks target measurements at finer temporal and spatial scales, and invest significant effort in developing metrics that are both comparable and locally relevant (Eakin and Luers 2006). Finally, because there is strong evidence that stakeholder participation affects data quality (Smit and Wandel 2006), the SES approach has increasingly incorporated self-assessment tools such as the Coastal Community Resilience Index (CRI) (Sempier et al. 2010) and co-participatory mechanisms in the formulation of instruments (Biggs et al. 2012; Walker et al. 2006).

Aside from theoretical frameworks, the development of social indicators has also been strongly shaped by specific methodological and modeling techniques used to represent and evaluate human-environmental interactions within a system (Le Gentil and Mongruel 2015). The DPSIR framework and its different variants, for instance, represent complex socioecological situations by establishing causal relations between drivers, pressures, stressors/states, impacts, and responses (Gari et al. 2015; Kristensen 2004). Indicators become components of the model, with variables such as demographic attributes or transportation constituting drivers of the system. On the other hand, pressures affecting the integrity of the landscape are captured by measures of resource use, industry diversity or resource dependence, emissions, and changes in land use (Hou et al. 2014).
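
To make the classification concrete, the sketch below assigns hypothetical indicator names to DPSIR components; the assignments follow the pattern just described rather than any specific reviewed report:

```python
# Hypothetical DPSIR classification of indicators.
dpsir = {
    "driver":   ["population_growth", "transportation_volume"],
    "pressure": ["resource_extraction_rate", "land_use_change", "emissions"],
    "state":    ["water_quality_index", "habitat_extent"],
    "impact":   ["fishery_revenue_loss", "health_burden"],
    "response": ["regulation_count", "restoration_spending"],
}

def category_of(indicator: str) -> str:
    """Return the DPSIR component an indicator is modeled as."""
    for category, members in dpsir.items():
        if indicator in members:
            return category
    return "unclassified"

print(category_of("land_use_change"))  # -> "pressure"
```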

Another frequently used methodological technique is Multi-Criteria Analysis (MCA), which allows researchers to better represent decisions and their impacts by eliciting stakeholder preferences among choices. The method requires participants to evaluate different management scenarios through indicators and criteria (Sheppard and Meitner 2005), ultimately determining the best course of action in a given context.
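
A minimal weighted-sum sketch of the MCA logic, with invented scenarios, criteria, and weights; real applications elicit weights from stakeholders and often use more sophisticated aggregation rules:

```python
# Toy Multi-Criteria Analysis: scenarios are scored on weighted criteria
# and ranked by weighted sum. All numbers are invented for illustration.
criteria_weights = {"habitat_recovery": 0.40, "jobs_retained": 0.35, "cost": 0.25}

# Scores are normalized to [0, 1]; "cost" is entered as 1 - normalized cost
# so that higher is always better.
scenarios = {
    "status_quo":      {"habitat_recovery": 0.2, "jobs_retained": 0.9, "cost": 0.8},
    "partial_closure": {"habitat_recovery": 0.6, "jobs_retained": 0.6, "cost": 0.5},
    "marine_reserve":  {"habitat_recovery": 0.9, "jobs_retained": 0.3, "cost": 0.3},
}

def mca_score(scores: dict) -> float:
    return sum(criteria_weights[c] * s for c, s in scores.items())

ranking = sorted(scenarios, key=lambda s: mca_score(scenarios[s]), reverse=True)
print(ranking)  # the "best" course of action under these particular weights
```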

MCA, like stated preference, willingness to pay, and other valuation instruments, is part of a larger group of tools used by economic approaches to assess human-environment interactions. However, these techniques are also implemented in consonance with sustainability or SES theories and can be regularly employed during the course of impact assessment protocols to evaluate socioeconomic consequences of policies or decisions. Traditionally, economic perspectives have focused on the environment as an asset and an input into the production of goods and services, adopting either a descriptive (what is) or a normative (what ought to be) approach (Tietenberg 2004). At the core of the normative approach, especially as it relates to environmental policy, is the accounting for externalities generated from our interaction with the environment and the impacts resulting from the production process (Cropper and Oates 1992). Economists measuring these impacts would focus on studying phenomena that fall into two broad categories: market (through observable prices or quantities) and nonmarket (by eliciting values and preferences). This has changed over the past couple of decades with the emergence of the ecosystem service approach (Boyd et al. 2015; Hou et al. 2014). Consequently, a concerted effort within economic and environmental fields has emerged to explicitly link ecological structures and functions with human well-being. New ecosystem service perspectives may help deal with limitations associated with current social indicator frameworks (Yoskowitz and Russell 2015). As the following sections will discuss, some of these shortcomings arise from the lack of integration of ecological, economic, and social dimensions in measurements, along with difficulties associated with the transition of information into decisions and policies.

Materials and Methods

A targeted or scoping review was conducted in order to explore the current federal government initiatives addressing the development of socioeconomic indicators in coastal resource use and management (Booth et al. 2012).

Research Objectives

The purpose of the literature review is both aggregative (directed at answering a particular research question) and interpretive (informing prior assertions) (Booth et al. 2012). In the first case, the review aims to assess whether current social indicator efforts recognize and overcome prior limitations in the explicit use of known theoretical frameworks and advance the integration of research with policy. In the second case, the review helps to further document how the social indicator movement is changing as it takes on the measurement of new phenomena. In short, the review identifies the major rationale underpinning indicator efforts in coastal and ocean resource management over the past 15 years and sheds some light on how vulnerability, resilience, and well-being are currently operationalized within policy, theory, and research contexts.

Scoping Process

To investigate how socioeconomic indicators are defined; what domains, dimensions, and types of indicators are used; and the explicit linkages between theories and frameworks and the policy cycle, only white papers and publicly available reports were used. The assumption behind this decision was that white papers and reports, given their intended audience, are compelled to state openly the linkages among research, theory, policy, and resource management. By contrast, peer-reviewed articles present information in a format directed at an academic audience and are often not freely accessible. While this situation is rapidly changing, with a stronger emphasis on the application of knowledge to real-life management scenarios, publication constraints on the length of manuscripts often prevent a full treatment of policy linkages. Furthermore, peer-reviewed articles are rarely the original source of data, building instead from information included in reports. Exceptions to these criteria were made when project reports were not readily available or when publications placed a special emphasis on policy dimensions ignored in the source document.

Search and Selection Processes

The search for white papers and reports was restricted to documents published from 2000 to July 2016 by government agencies operating at the federal and state levels. The search was further narrowed to institutions with jurisdiction over watershed, ocean, and coastal resource management. These included the NOAA, a branch of the US Department of Commerce; the EPA; the US Forest Service (FS), of the US Department of Agriculture (USDA); and the Bureau of Ocean Energy Management (BOEM), the US Geological Survey (USGS), the National Park Service (NPS), and the US Fish and Wildlife Service (FWS), all under the purview of the DOI. Finally, reports originating from agency contractors or projects funded by federal grants were also considered.

First, documents were compiled by visiting information portals, institutional websites, and online databases from each agency. Additionally, queries were run in Web of Science and Google Scholar online databases with the keywords “social indicators,” “environment,” and “report” to further supplement the search. The criteria for inclusion of documents were defined as follows: (1) explicit treatment of social indicators through the review of past efforts, the development of new metrics, or the implementation of metrics; (2) explicit use of quantitative and/or qualitative data sets; (3) no replication of information (information is original unless thematically integrated); and (4) the inclusion of metrics that were social or socioeconomic in nature and that referred to coastal or ocean resource management or coastal zone management.

In the second phase of the search and selection process, the 208 compiled documents were filtered by reading abstracts and conclusions. Duplicate reports and non-US-related documents were removed, bringing the total number of items to 97. To prepare reports for statistical and content analyses, documents were added to an Excel spreadsheet. In this database, an entry was created for each item with the following columns: agency, title, full citation, year of publication, location, definition of indicator and intent of report, domains of indicators, list of indicators, characteristics of the target population (sample size, unit of analysis), limits, benefits, connections to policy implementation, and observations.
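
For readers wishing to replicate the database structure, a minimal sketch in Python with pandas; the column names mirror the list above, and no rows are shown because the entries would be the coded reports themselves:

```python
import pandas as pd

# One column per coding field described above; each selected report
# becomes one row of this frame.
columns = [
    "agency", "title", "full_citation", "year_of_publication", "location",
    "indicator_definition_and_intent", "indicator_domains", "indicator_list",
    "target_population",  # sample size, unit of analysis
    "limits", "benefits", "policy_implementation_connections", "observations",
]
review_db = pd.DataFrame(columns=columns)
print(review_db.columns.tolist())
```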

Analysis

The content analysis was restricted to documents that included indicators, indices, domains and dimensions, or indicator frameworks in order to elicit common themes in measured areas (Bernard 2006; Weller 2007). Content analysis allows for the semantic organization and classification of reports so as to identify relations to key research questions. After formatting, cleaning, and deduplication of entries with Anthropac 4.98, basic summary statistics were derived to characterize the sample. Given the large number of unique indicators, indices, and domains, three iterations were required to regroup synonymous items and remove qualifying terms. The refining process used the SOUNDEX function in Anthropac, which groups terms by phonetic similarity. Coding was kept to the minimum possible and involved subsuming smaller items into larger encompassing themes or constructs according to semantic overlap. This permitted the subsequent analysis to be done at the domain or dimension level.
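
To illustrate the kind of phonetic keying such a routine performs, here is a minimal reimplementation of standard American Soundex applied to candidate theme labels (our reconstruction for exposition, not Anthropac's code; the labels are hypothetical):

```python
from collections import defaultdict

def soundex(word: str) -> str:
    """Minimal American Soundex: first letter plus up to three digits."""
    codes = {}
    for letters, digit in [("bfpv", "1"), ("cgjkqsxz", "2"), ("dt", "3"),
                           ("l", "4"), ("mn", "5"), ("r", "6")]:
        for ch in letters:
            codes[ch] = digit
    word = "".join(ch for ch in word.lower() if ch.isalpha())
    if not word:
        return ""
    out, last = word[0].upper(), codes.get(word[0], "")
    for ch in word[1:]:
        digit = codes.get(ch, "")
        if digit and digit != last:
            out += digit
        if ch not in "hw":  # vowels reset the run of equal codes; h and w do not
            last = digit
    return (out + "000")[:4]

# Candidate labels sharing a key are reviewed as possible synonyms to merge.
groups = defaultdict(list)
for theme in ["governance", "governments", "fisherman", "fishermen", "wellbeing"]:
    groups[soundex(theme)].append(theme)
print(dict(groups))
# {'G165': ['governance', 'governments'], 'F265': ['fisherman', 'fishermen'],
#  'W415': ['wellbeing']}
```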

In the first step of coding, all items were listed and ranked by frequency. Then, single indicators, indices, and domains were merged into themes when their implementation explicitly stated a connection to that particular theme, or when their unit of measurement coincided with or referred to the same theoretical construct. Strict criteria were adopted for merging items, with the goal of preserving the variability in how areas of measurement are operationalized while singling out consistencies. For example, the general theme of well-being was used to subsume indicators or indices that directly referred to human well-being, individual well-being, personal well-being, and household well-being. However, we preserved distinctions when items clearly referred to different areas or involved qualities related to different groups or types of population samples (national vs. local levels). We also kept themes classified separately when items implemented known indicator frameworks with explicit domains and dimensions. For instance, where resilience was operationalized as vulnerability, sensitivity, and adaptive capacity, all three terms were kept as separate themes. A second reason for carrying out the analyses at the level of domains is that a small number of reports did not include the full set of single metrics used, listing only indices or areas of measurement. Once all codification processes were concluded, frequency, correlation, and similarity tests, cluster analysis, and nonmetric multidimensional scaling (MDS) were conducted with the software packages UCINET 6 and XLSTAT.
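
The similarity computations reported below can be sketched as follows, assuming a binary report-by-theme incidence matrix and reading "positive matches" as the Jaccard index (our interpretation of the method; the matrix is a toy, not the study data):

```python
import numpy as np

# Toy incidence matrix: rows are reports, columns are themes; a 1 means
# the theme was coded in that report. Values are invented.
themes = ["demographics", "governance", "economic_capital", "vulnerability"]
X = np.array([
    [1, 1, 1, 0],
    [1, 0, 1, 0],
    [0, 1, 1, 1],
    [1, 1, 0, 0],
])

C = X.T @ X                                # pairwise co-occurrence counts
diag = np.diag(C)
union = diag[:, None] + diag[None, :] - C  # reports mentioning either theme
similarity = C / union                     # Jaccard (positive-match) index
print(np.round(similarity, 2))
```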

Study Limitations

Limitations of this study include the definition of criteria for the scoping, searching, and inclusion processes, as well as the coding of individual reports and indicators into larger themes. It should be noted that the list of reports may not be exhaustive, and additional revisions will be needed in the future to update results. For example, there may be documents that are not accessible online or have not been made publicly available, as is the case for grant reports and embargoed studies. In addition, some of the projects reviewed had not been published in a technical memorandum or in a freely accessible format, but are included in chapters of books or journals not available to the authors. While these gaps in information may exist, this study covers a large proportion of the agency-generated literature. Thus, the inferences and conclusions proposed should be considered within the context of the larger sample of social indicator research, and not just by themselves (Le Gentil and Mongruel 2015). Finally, limitations in the treatment and cleaning of the data may affect the results of this study. In the future, investigator biases will be reduced by implementing dual-coding processes that can ensure interrater reliability.

Results

Main Characteristics

The review process resulted in a population of 208 documents that included reports, white papers, and published articles. Of these, 97 documents were retained after items were eliminated because of redundancy or lack of relevance to the research goals. The largest proportion of documents originated from the NOAA, the DOI, and the EPA (Figure 1). This result is not surprising given that these are the main agencies with jurisdiction over watershed, coastal, and ocean resources. About 20 percent of studies were published in peer-reviewed journals or conference proceedings, a trend that has been increasing in the past few years. Studies are mostly located in the United States and its territories, with significant concentrations in the Gulf of Mexico (28 percent) and the Northeast (27 percent). Only 10 percent of studies had national coverage.

Figure 1

Major Characteristics of Social Indicator Reports

Note: Main characteristics of literature review findings. A total of 208 documents were identified, of which 97 constituted reports. After cleaning for overlaps, the total number of reports analyzed was 88. The majority of reports originated from the NOAA, with an important increase in the number of documents in 2011.

Citation: Environment and Society 8, 1; 10.3167/ares.2017.080102

Type of Indicator Domains or Themes

The number of reports analyzed was 88, with 663 indicators, indices, and/or domains or areas of measurement mentioned (see Appendix for a list of reports). After recodification and data cleaning, the number of distinct, nonoverlapping constructs or themes approached 204. On average, each report contained 7.5 themes. The frequency with which a theme was mentioned was graphed with a scree plot to decide on a cutoff for the similarity and correlation analyses (Figure 2). Based on the literature, the cutoff was set at four (that is, a theme had to be mentioned in at least four different reports), which resulted in a list of 52 items (Borgatti and Carboni 2007). While a high proportion of reports stated the measurement of well-being, resilience, vulnerability, and sustainability as a goal, the most frequently mentioned themes were demographic attributes, fishing and coastal resource dependence, and governance (Table 1). Well-being as a single metric or domain of measurement was mentioned in only 8 cases, resilience in 7, and vulnerability in 14. The scarce use of these themes as indicators per se should not be interpreted as a lack of measurement. On the contrary, it is suggestive of a large set of alternatives in how each construct is operationalized to reflect equivalent dimensions of measurement.
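
The cutoff rule itself is simple; in the sketch below the first three counts come from Table 1, while the low-frequency labels are invented to show what the rule removes:

```python
# Retain only themes mentioned in at least four different reports,
# mirroring the scree-plot cutoff described above.
theme_counts = {"demographics": 24, "governance": 18, "well_being": 8,
                "pier_aesthetics": 3, "boat_names": 1}  # last two invented

CUTOFF = 4
retained = {t: n for t, n in theme_counts.items() if n >= CUTOFF}
print(sorted(retained, key=retained.get, reverse=True))
# -> ['demographics', 'governance', 'well_being']
```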

Figure 2

Scree Plot of Unique Indicators

Note: This graph displays the frequency with which unique indicators are mentioned. Observe the different “elbows,” or discontinuities, marked by the arrows in the curve. The fourth discontinuity is used as the criterion for the content and similarity analyses.

Citation: Environment and Society 8, 1; 10.3167/ares.2017.080102


Similarity

To analyze similarity and correlations, the lists of indicator themes were transformed into a 52-by-52 matrix. Similarity was computed by evaluating the co-occurrence of pairs of themes (presence or absence of positive matches) across the 88 reports. High similarity values were found for themes associated with fishing activities and infrastructure (0.85), participation and awareness (0.83), fishermen characteristics and investment (0.80), costs and investment (0.80), education and environmental capital (0.56), governance and economic capital (0.55), governance and environmental capital (0.55), and poverty and environmental risks (0.67). The analysis of correlations, which considered the order in which items were mentioned, found a high association among domains or themes related to fishing or coastal activities. Additionally, high correlation values were observed between governance and economic capital (0.66); governance and environmental capital (0.66); employment and poverty (0.77); business and transportation (0.77); and subsistence and ethnicity (0.73). In the case of health, correlations with other themes were generally very low, and in most cases negative, indicating variety in how this domain is operationalized across indicator sets. Positive correlations were seen between health and education (0.30), health and culture and well-being (0.37), and health and general socioeconomic conditions (0.37). For health, high correlation values were also observed with environmental justice (0.50), regulation (0.50), and safety (0.40). Like health, vulnerability showed generally low correlation values, with high associations only with adaptive capacity (0.40), sensitivity (0.31), and exposure (0.37).

Table 1. The 30 Most Frequent Indicators

Note: This table presents the most frequently mentioned indicators in federal agency reports published between 2000 and July 2016.

Indicator                                    Frequency
Demographics                                 24
Fishing and coastal resource dependence      20
Governance                                   18
Fishing infrastructure and characteristics   17
Housing infrastructure                       16
Economic capital                             16
Employment                                   16
Social capital                               16
Education                                    15
Health                                       14
Environmental capital                        13
Vulnerability                                13
Subsistence                                  11
Business                                     10
Environmental risks and hazards              10
Poverty                                      10
Transportation                               10
Ethnicity                                    10
Resource quality                             9
Culture and cultural well-being              9
Socioeconomic conditions                     9
History                                      8
Infrastructure capital                       8
Recreation fishing                           8
Well-being                                   8
Fishing subsidies                            7
Safety                                       7
Resource use                                 7
Resilience                                   7
Adaptive capacity                            6

A nonmetric multidimensional scaling (MDS) graph, which represents the similarity between items on a two-dimensional array, was used to identify potential clustering areas in the selection of indicator themes (Figure 3). It should be noted that proximity of the items on the map represents how often those themes were mentioned together in the reports. The x and y axes represent different clustering dimensions (they may be called larger theoretical constructs or frameworks) that could explain the proximities and distances between terms. Furthermore, it should be observed that items closer to the center are those most frequently mentioned in all reports.
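
A sketch of the nonmetric MDS step, assuming a precomputed 52-by-52 dissimilarity matrix (one minus the similarity matrix sketched earlier; a random symmetric stand-in keeps the snippet self-contained):

```python
import numpy as np
from sklearn.manifold import MDS

# Random symmetric stand-in for the real 52-by-52 similarity matrix.
rng = np.random.default_rng(0)
S = rng.random((52, 52))
S = (S + S.T) / 2
np.fill_diagonal(S, 1.0)
D = 1.0 - S  # convert similarity to dissimilarity

# Nonmetric MDS on precomputed dissimilarities, as in the analysis above.
mds = MDS(n_components=2, metric=False, dissimilarity="precomputed",
          random_state=0)
coords = mds.fit_transform(D)  # one (x, y) point per indicator theme
print(coords.shape, round(float(mds.stress_), 4))
```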

Figure 3

Multidimensional Scaling Plot of 52 Most Mentioned Indicators

Note: This figure presents a two-dimensional spatial analysis of similarities among reports. Each point represents one of the 52 most mentioned indicators. The distance between points reflects similarity among the indicators, with points closer together representing items that are often included together in indicator frameworks. The stress level for the MDS was 0.0086.

Citation: Environment and Society 8, 1; 10.3167/ares.2017.080102

In this case, the distribution of the points in the coordinate space is scattered in the shape of a cloud, which makes the discernment of potential dimensions difficult. However, some points exhibit small clusterings. For example, aggregation can be observed close to the origin for themes referring to economic conditions, economic capital, resource use dependence, environmental risk, infrastructure, and demographics, among others. This particular aggregation is reminiscent of the DPSIR framework, or it may refer to indicators that deal with the direct measurement of environmental impacts and economic vulnerability. Another potential cluster comprises the measurement of adaptive capacity, exposure, vulnerability, and sensitivity, which suggests the vulnerability formula within the SES framework. In the top left quadrant of the graph, the clustering may correspond to well-being, safety, health, and resilience. Finally, in the top mid-to-right quadrant, the clustering seems to indicate some analysis of governance and organizational issues.

To better understand the connection between indicator sets and theoretical frameworks, the structure of the data was further studied through cluster analysis. The analysis suggested the existence of several parent structures. These aggregations are represented by the main branches of a dendrogram tree (Figure 4) and may be interpreted as different indicator classes or potential theoretical frameworks. Structures had an unequal number of items, hinting at different degrees of variation in how each of these larger classes is operationalized through metrics.
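
A sketch of the corresponding hierarchical clustering step; the average-linkage method is our assumption, since the reports reviewed do not specify one, and the random dissimilarity matrix again stands in for the real data:

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import squareform

# Random symmetric stand-in for the 52-by-52 dissimilarity matrix.
rng = np.random.default_rng(0)
S = rng.random((52, 52))
S = (S + S.T) / 2
np.fill_diagonal(S, 1.0)
D = 1.0 - S

Z = linkage(squareform(D, checks=False), method="average")  # build the tree
labels = fcluster(Z, t=6, criterion="maxclust")             # cut into six clusters
print(labels)
# scipy.cluster.hierarchy.dendrogram(Z) would draw a tree like Figure 4.
```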

Figure 4

Cluster Analysis of Indicators

Note: Dendrogram displaying the clustering of 52 indicators. The different branches represent structures around which indicators cluster according to their similarity. Observe that there is an unequal distribution in the number of indicators across these structures. This may reflect differences in how some constructs are operationalized in the reports. In addition, the parent structures show some overlap with the theoretical frameworks discussed previously.

Citation: Environment and Society 8, 1; 10.3167/ares.2017.080102

For example, given that the focus of the review is on watershed, coastal, and ocean resource management, one of the main classes of indicators includes only two items, referring to land cover and changes in land use and the exploitation of the environment. A second class, the largest structure in the dendrogram, includes demographics, fishing and coastal resource dependence, fishing and coastal infrastructure, housing, economic indicators, employment, recreation, subsistence, governance, and several others. The composition of this class once again is suggestive of a DPSIR type of approach, with some elements of an SES framework related to risks, disasters, and hazards. A third analytical framework is the cluster including costs, expenditures, market participation, level of investment, and fishermen characteristics, which seems to underlie most classic economic analyses. The fourth and fifth clusters refer, respectively, to empowerment, well-being, inequality, and resilience; and to participation, awareness, resource use, conservation and restoration, environmental justice, and leadership. The selection of these domains or dimensions can be interpreted as evoking a predominantly sustainability approach, with some relations to SES in its focus on governance and participation. The sixth cluster can be read as implementing the vulnerability formula, as it includes vulnerability, sensitivity, adaptive capacity, and exposure.

Research-Policy Connections

A major objective of this article was to determine whether current social indicator efforts recognize and overcome prior limitations in the use of social metrics and advance the integration of science and policy. To that end, reports were analyzed in terms of how indicators were defined and whether there was a relation to larger indicator frameworks. Then, the analysis considered the goals and scopes of the document, potential deliverables, and the mention of limitations.

Definitions

The definitions of indicators among reports varied over time and according to stated objectives. For example, in the first part of the 2000s, documents explored the identification of impacts (Hall-Arber et al. 2002; Jepson et al. 2002), the creation of baselines and community profiles (Aratame and Singelmann 2002; Luke et al. 2002), indices of resource dependence (Impact Assessment 2004, 2006), and sustainability (FS 2004). As illustrated in the “Socioeconomic Manual for Coral Reef Management,” indicators were considered as “a way to learn about the social, cultural, economic and political conditions of individuals, groups, communities and organisations” (Bunce et al. 2000: 2). By 2005, community profiles and impact assessments had increased in frequency (Allen and Bartram 2008; Griffith et al. 2007; Norman et al. 2007; Sepez et al. 2005). The 2000 to 2005 period may indicate, then, the implementation of sustainability or impact assessment approaches to the development of indicators.

In the second half of the 2000s, vulnerability assessments became more prevalent along with measures of resilience and the integrated multidisciplinary assessment of conditions, pointing to the adoption of SES vulnerability and risk frameworks (Alessa et al. 2008; Barnett et al. 2008; Clay and Olson 2008; Cutter and Emrich 2006). Definitions presented indicators as “any variable that characterizes the level of vulnerability-resilience to a community in a watershed” (Alessa et al. 2008: 528).

Between 2010 and 2016, the number of documents increased, with a concentration on issues about well-being (Andrews and Withey 2012; Biedenweg et al. 2014; L. Smith et al. 2013, 2015; S. Smith et al. 2011) and resilience (Bridges et al. 2013; Sempier et al. 2010), along with a special focus on tracking changes and the causes and effects of climate change (Dillard et al. 2013; Huang and Barzyk 2015; Lovelace and Dillard 2012; Lovelace et al. 2013; Pollnac et al. 2015). Within this context, the focus on indicators has shifted to the provision of “relevant information for stakeholder decisions on climate resiliency and on the efficacy of resiliency measures to reduce vulnerability and risk” (Solecki et al. 2015: 83), marking a consolidation of SES approaches within federal institutions. Focusing on adaptation, indicators became metrics that can “[tell] us that something is changing or has made a change” (Lovelace et al. 2013).

Goals, Scopes, and Deliverables

To analyze the documents’ goals, reports were classified according to their functional purpose (Bowen and Riley 2003). We found that many of the reports targeted specific informational gaps and perceived limitations in knowledge about socioeconomic systems. These gaps reflected a lack of effective measures of the capacity of a community to manage and plan (Donatuto et al. 2014; Newman et al. 2002), or of ways to quantify complex dimensions of resource use not related to consumptive activities, such as aesthetic and spiritual values (Boyd et al. 2015; Johns et al. 2014; Sherrouse et al. 2014; Villamagna et al. 2014). Other informational needs originated in the measurement of resource dependence and the significance of resources in terms of subsistence (rather than commercial or recreational value) (Breslow et al. 2013; Luton 2013; Jacob et al. 2010; Jacob and Jepson 2009).

A second group of reports was predominantly concerned with issues of regulatory compliance and the assessment of impacts. These documents aimed to provide context for considering the trade-offs between different policy actions (Campbell 2011; Dismukes et al. 2003; Fleming et al. 2014; Petterson et al. 2008; Reedy-Maschner and Maschner 2012). Closely connected to the assessment of regulatory impacts is the scoping of damage from environmental and technological hazards (Austin et al. 2014; Colburn et al. 2015; Impact Assessment 2006) and from exposure to climate change risks (Bridges et al. 2013; Dillard et al. 2013; Himes-Cornell and Kasperski 2015; Huang and Barzyk 2015).

A third group of reports had the goal of informing strategies for future restoration, mitigation, or recovery actions (Carriger et al. 2015; Chang et al. 2013; EPA 2016). Reports also included objectives related to the prioritization of strategies for climate change adaptation, preparedness, and planning (Colburn et al. 2016; Ekstrom 2015; Kenney et al. 2013; Sempier et al. 2010; Summers et al. 2016).

The final group of reports had goals pertaining to performance evaluation and the assessment of the success of programmatic actions (Clay et al. 2014; Heinz Center 2003). It is important to observe that although not all of the documents proposed performance measures, a significant proportion of reports directly addressed policy implementation concerns. Among these documents were white papers that targeted action plans (Clay et al. 2014; Fleming et al. 2014; Jacob et al. 2013; Newman et al. 2002). In the next section, the link between indicator efforts and policies will be further explored.

Discussion

Increased government concern with the vulnerability, resilience, and adaptation of coastal communities has created the context for the reemergence of a new wave of social indicators. The use of metrics for quantifying socioeconomic phenomena has found support among policy makers and federal agencies. Proponents of indicators envision statistics as providing critical intelligence in the development of climate change adaptation responses and preparedness measures. The interest in social metrics has crystallized in the work of interagency groups, new indicator handbooks and standards, and the synthesis and consolidation of information in large online repositories. This article investigates the main rationale underlying this effort and explores the implementation of these new measures in policy contexts. To that end, results from a content analysis of 88 agency reports published between 2000 and July 2016 are discussed. The main findings can be summarized in the following points.

While the analysis showed some moderate convergence on which dimensions should be measured (i.e., vulnerability, well-being, and resilience) and on the major indicator frameworks used, there is a high level of heterogeneity in the number and types of metrics present across the reports, which suggests significant variability in the implementation of common theoretical notions. The exploration of how the term “indicator” was defined across documents also showed variation that is symptomatic of a change in policy interests and, subsequently, in research concentration. In the early 2000s, metrics were characterized as providing baseline or contextual information for decision making and regulation, with a stronger mention of sustainability. More recent definitions, however, refer to improving the information available to communities in order to enhance climate change adaptation responses and well-being, recalling SES frameworks for disaster and risk prevention, as well as new ecosystem services approaches. This shift in priorities in what is measured reflects the political repercussions of events like Hurricane Sandy and the Deepwater Horizon oil spill. It captures the urgency of developing effective risk communication mechanisms and preparedness measures within federal agencies.

The study of goals and objectives in many of these reports also offers a perspective on how indicators connect to policy and implementation. In contrast to previous indicator movements, a large proportion of these white papers take as their starting point an information gap that originates in policy requirements or management actions. There is a direct intention to answer policy and decision-making questions, and some of the documents even target programmatic objectives at the agency level and the inclusion of indicators in adaptive management cycles. Advances are thus being made in the institutionalization of policy issues and the explicit inclusion of monitoring systems within federal agencies. A good example of this further integration of research with policy is the National Climate and Health Assessment process led by the US Global Change Research Program.

Despite the value of these efforts, there is still a significant distance to cover before policy processes can meet the complex reality that characterizes environmental planning (Innes and Booher 2010). For example, as underlined in recent publications, discrepancies remain between stated policy goals, social metrics, and the actual use of information (Hicks et al. 2016). One way to facilitate integration is to target metrics to broader audiences beyond decision makers and to ensure co-participatory processes in the development of indicators (Kenney et al. 2013). Another critical point lies in collaboration across multiple government and nongovernment institutions to consolidate frameworks, measurement philosophies, and tools.

The creation of interagency working groups on topics like resilience, ecosystem services, and well-being represents a substantial effort in that direction. But two factors remain important constraints to consolidation: the proliferation of independent frameworks that, while related to a larger theoretical field, depart from established models; and the underrepresentation of the social sciences in many of these agencies (Innes and Booher 2000; Wilson et al. 2007). Out of 150 resilience projects funded through the Sandy Supplemental, only 24 included social science metrics (Abt Associates 2015). This situation contrasts markedly with the highly collaborative and dynamic planning scenarios that communities, nongovernmental institutions, and transnational organizations are adopting across different environmental and multidisciplinary landscapes (Clarke et al. 2013; Feurt 2006; Innes and Booher 2010).

A final element to consider is the constitutive role that indicators can play in illuminating or obscuring issues by selectively focusing attention (Hicks et al. 2016; MacDowall et al. 2016). At least two major precautions derive from this observation. First, it is critical that decision makers and resource managers recognize that the choice of a particular set of indicators predetermines the identification of problems and solutions and constrains potential interventions and actions. If the current indicator effort is to succeed, the program needs to move beyond the mere identification of problems to the delineation of inclusive solutions. This requires combining bottom-up and top-down mechanisms that can secure the flow of information and the consideration of all different voices.

The second precaution is that numbers alone do not tell the whole story (Noll 2002). Whereas the proliferation of statistical approaches has influenced how governments think about information and what constitutes data, evidence-based policies and governance practices can still draw on kinds of evidence beyond quantification (Davies 2012). Systems of traditional ecological knowledge, scenario-planning methodologies, qualitative forecasting methods, and experiential knowledge are but a small set of the tools that should be used to improve the reliability and ground-truthing of indicators (Hicks et al. 2016; Martínez and Dopheide 2014).

Much has changed since the program of social indicators was first introduced in the 1960s. New challenges have emerged, and with them, our vision of the future has radically altered. The recognition that socioecological environments are highly complex, and that our knowledge is limited by uncertainty and unpredictability, has made many of the old assumptions about measurement and metrics obsolete. In this context, where requests for actionable information are increasing at a rapid pace, social indicators can become valuable aids. Centralizing and leveraging efforts, setting realistic expectations, and achieving end-to-end integration with policy cycles are some of the challenges the current program needs to face to secure its continuation.

Conclusion

Over the past 50 years, there have been three waves of social indicator development. The most recent movement has been driven by several focusing events, including Hurricanes Sandy, Ike, and Katrina and the Deepwater Horizon oil spill. While institutions are able to respond to technical and engineering needs following a disaster, they lack important social and economic information that would make response and recovery decisions more effective. This critical gap has led several federal interagency groups in the United States to undertake the task of embedding social, behavioral, and economic tools into policy and governance.

This article described the examination and initial content analysis of more than 200 documents and the detailed analysis of indicator domains and themes of 88 documents published between January 2000 and June 2016. The main findings are:

  1. A high level of heterogeneity exists in the number and types of metrics present across the reports.
  2. A large number of independent frameworks and the underrepresentation of the social sciences in many of these agencies represent important constraints to consolidation.
  3. How the term “indicator” is defined has varied over time, in a way symptomatic of changing policy interests and, subsequently, research concentration.
  4. Advances have been made in the institutionalization of policy issues and the explicit inclusion of indicators in monitoring systems.
  5. There is urgency in developing effective risk communication mechanisms and preparedness measures within federal agencies, and indicators are playing an important role.

The federal agency family, and others, would benefit from a concerted effort to develop consistency in frameworks and indicators, thus easing the cost of adoption. This must be done carefully, as the selection of indicators and domains would predetermine the problems that need to be addressed and may constrain what actions can be taken. If this third wave of the indicator movement is to succeed, programs must move beyond the mere identification of problems to the delineation of holistic solutions.

REFERENCES

Andrews, Frank M. 1989. “The Evolution of a Movement.” Journal of Public Policy 9 (4): 401–405. doi:10.1017/S0143814X00008242.

Andrews, Frank M., and Stephen B. Withey. 2012. Social Indicators of Well-Being: Americans’ Perceptions of Life Quality. Berlin: Springer Science and Business Media.

Atkinson, Tony, Bea Cantillon, Eric Marlier, and Brian Nolan. 2002. Social Indicators: The EU and Social Inclusion. Oxford: Oxford University Press.

Austin, Diane, Brian Marks, Kelly Ames, Tom McGuire, Ben McMahan, Victoria Phaneuf, Preetam Prakash, Bethany Rogers, Carolyn Ware, and Justina Whalen. 2014. Offshore Oil and Deepwater Horizon: Social Effects on Gulf Coast Communities—Volume I: Methodology, Timeline, Context and Communities. OCS Study BOEM 2014-617. New Orleans, LA: BOEM Gulf of Mexico OCS Region. http://www.data.boem.gov/PI/PDFImages/ESPIS/5/5384.pdf.

Badham, Marnie. 2009. “Cultural Indicators: Tools for Community Engagement?” International Journal of the Arts in Society 3 (5): 67–76.

Barnett, Jon, Simon Lambert, and Ian Fry. 2008. “The Hazards of Indicators: Insights from the Environmental Vulnerability Index.” Annals of the Association of American Geographers 98 (1): 102–119. doi:10.1080/00045600701734315.

Bauer, Raymond A. 1966. Social Indicators. Cambridge, MA: MIT Press.

Bergamini, Nadia, Robert Blasiak, Pablo Eyzaguirre, Kaoru Ichikawa, Dunja Mijatovic, Fumiko Nakao, and Suneetha M. Subramanian. 2013. Indicators of Resilience in Socio-ecological Production Landscapes (SEPLs). Yokohama: United Nations University Institute of Advanced Studies.

Berger-Schmitt, Regina, and Heinz-Herbert Noll. 2000. Conceptual Framework and Structure of a European System of Social Indicators. EuReporting Working Paper No. 9. Mannheim: Centre for Survey Research and Methodology (ZUMA). http://www.gesis.org/fileadmin/upload/dienstleistung/daten/soz_indikatoren/eusi/paper9.pdf.

Berkes, Fikret. 2006. “From Community-based Resource Management to Complex Systems: The Scale Issue and Marine Commons.” Ecology and Society 11 (1): art. 45. doi:10.5751/ES-01431-110145.

Bernard, H. Russell. 2006. Research Methods in Anthropology: Qualitative and Quantitative Approaches. 4th ed. Lanham, MD: AltaMira Press.

Biggs, Reinette, Maja Schlüter, Duan Biggs, Erin L. Bohensky, Shauna Burn Silver, Georgina Cundill, Vasilis Dakos, et al. 2012. “Toward Principles for Enhancing the Resilience of Ecosystem Services.” Annual Review of Environment and Resources 37: 421–448. doi:10.1146/annurev-environ-051211-123836.

Booth, Andrew, Diana Papaioannou, and Anthea Sutton. 2012. Systematic Approaches to a Successful Literature Review. Los Angeles: Sage.

Borgatti, Stephen P., and Inga Carboni. 2007. “On Measuring Individual Knowledge in Organizations.” Organizational Research Methods 10 (3): 449–462. doi:10.1177/1094428107300228.

Bowen, Robert E., and Cory Riley. 2003. “Socio-economic Indicators and Integrated Coastal Management.” Ocean and Coastal Management 46 (3–4): 299–312. doi:10.1016/S0964-5691(03)00008-5.

Bridges, Todd, Charley Chesnutt, Roselle Henn, Paul Wagner, Camley Walters, Ty Wamsley, and Kate White. 2013. Coastal Risk Reduction and Resilience. New York: US Army Corps of Engineers. http://www.swg.usace.army.mil/Portals/26/docs/PAO/Coastal.pdf.

Brown, Brett V., and Thomas Corbett. 1997. Social Indicators and Public Policy in the Age of Devolution. Special Report no. 71. Madison: Institute for Research on Poverty, University of Wisconsin–Madison. http://www.irp.wisc.edu/publications/sr/pdfs/sr71.pdf.

Carpenter, Ann. 2013. Social Ties, Space, and Resilience: Literature Review of Community Resilience to Disasters and Constituent Social and Built Environmental Factors. Community and Economic Development Discussion Paper No. 02-13. Atlanta: Federal Reserve Bank of Atlanta. https://www.frbatlanta.org/-/media/documents/community-development/publications/discussion-papers/2013/02-literature-review-of-community-resilience-to-disasters-2013-09-25.pdf.

Carpenter, Steve, Brian Walker, J. Marty Anderies, and Nick Abel. 2001. “From Metaphor to Measurement: Resilience of What to What?” Ecosystems 4: 765–781. doi:10.1007/s10021-001-0045-9.

Carriger, John F., Stephen J. Jordan, Janis C. Kurtz, and William H. Benson. 2015. “Identifying Evaluation Considerations for the Recovery and Restoration from the 2010 Gulf of Mexico Oil Spill: An Initial Appraisal of Stakeholder Concerns and Values.” Integrated Environmental Assessment and Management 11 (3): 502–513. doi:10.1002/ieam.1615.

Clarke, Beverley, Laura Stocker, Brian Coffey, Peat Leith, Nick Harvey, Claudia Baldwin, Tom Baxter, et al. 2013. “Enhancing the Knowledge-Governance Interface: Coasts, Climate and Collaboration.” Ocean and Coastal Management 86: 88–99. doi:10.1016/j.ocecoaman.2013.02.009.

Clay, Patricia M., Andrew Kitts, and Patricia Pinto da Silva. 2014. “Measuring the Social and Economic Performance of Catch Share Programs: Definition of Metrics and Application to the US Northeast Region Groundfish Fishery.” Marine Policy 44: 27–36. doi:10.1016/j.marpol.2013.08.009.

Clay, Patricia M., and Julia Olson. 2008. “Defining ‘Fishing Communities’: Vulnerability and the Magnuson-Stevens Fishery Conservation and Management Act.” Human Ecology Review 15 (2): 143–160.

Cobb, Clifford W., and Craig Rixford. 1998. Lessons Learned from the History of Social Indicators. Vol. 1. San Francisco: Redefining Progress.

Cohen, Wilbur J. 1969. Toward a Social Report. ERIC no. ED054039. Washington, DC: Department of Health, Education, and Welfare. http://eric.ed.gov/?id=ED054039.

Cropper, Maureen L., and Wallace E. Oates. 1992. “Environmental Economics: A Survey.” Journal of Economic Literature 30 (2): 675–740.

Cutter, Susan L., and Christopher T. Emrich. 2006. “Moral Hazard, Social Catastrophe: The Changing Face of Vulnerability along the Hurricane Coasts.” The ANNALS of the American Academy of Political and Social Science 604 (1): 102–112. doi:10.1177/0002716205285515.

Cutter, Susan L., Christopher T. Emrich, Jennifer J. Webb, and Daniel Morath. 2009. Social Vulnerability to Climate Variability Hazards: A Review of the Literature. Final Report to Oxfam America. Columbia: Hazards and Vulnerability Research Institute, University of South Carolina.

Dahl, Arthur Lyon. 2012. “Achievements and Gaps in Indicators for Sustainability.” Ecological Indicators 17: 14–19. doi:10.1016/j.ecolind.2011.04.032.

Davies, Philip. 2012. “The State of Evidence-based Policy Evaluation and Its Role in Policy Formation.” National Institute Economic Review 219 (1): R41–R52. doi:10.1177/002795011221900105.

Eakin, Hallie, and Amy Lynd Luers. 2006. “Assessing the Vulnerability of Social-Environmental Systems.” Annual Review of Environment and Resources 31: 365–394. doi:10.1146/annurev.energy.30.050504.144352.

Ferriss, Abbott L. 1979. “The U.S. Federal Effort in Developing Social Indicators.” Social Indicators Research 6 (2): 129–152.

Feurt, Christine. 2006. Cultural Models: A Tool for Enhancing Communication and Collaboration in Coastal Resources Management. NOAA Grant No. NA03NOS4190195. Wells, ME: Wells National Estuarine Research Reserve. http://www.wellsreserve.org/writable/files/microsoft_word_-_cultural_models_primer.pdf.

“FFO-2015.” 2016. NOAA RESTORE Act Science Program. https://restoreactscienceprogram.noaa.gov/funding/ffo-2015 (accessed 1 September 2016).

Force, Jo Ellen, and Gary E. Machlis. 1997. “The Human Ecosystem Part II: Social Indicators in Ecosystem Management.” Society and Natural Resources 10 (4): 369–382. doi:10.1080/08941929709381035.

Foucault, Michel. 2007. Security, Territory, Population: Lectures at the Collège de France, 1977–1978. Basingstoke: Palgrave Macmillan.

Gari, Sirak R., Alice Newton, and John D. Icely. 2015. “A Review of the Application and Evolution of the DPSIR Framework with an Emphasis on Coastal Social-Ecological Systems.” Ocean and Coastal Management 103: 63–77. doi:10.1016/j.ocecoaman.2014.11.013.

Gergen, Kenneth J. 1973. “Social Psychology as History.” Journal of Personality and Social Psychology 26 (2): 309–320.

Green, Maria. 2001. “What We Talk about When We Talk about Indicators: Current Approaches to Human Rights Measurement.” Human Rights Quarterly 23 (4): 1062–1097. doi:10.1353/hrq.2001.0054.

Hacking, Ian. 1990. The Taming of Chance. Cambridge: Cambridge University Press.

Hák, Tomás, Bedrich Moldan, and Arthur Lyon Dahl. 2012. Sustainability Indicators: A Scientific Assessment. Washington, DC: Island Press.

Holling, Crawford S. 1973. “Resilience and Stability of Ecological Systems.” Annual Review of Ecology and Systematics 4: 1–23.

Hou, Ying, Shudong Zhou, Benjamin Burkhard, and Felix Müller. 2014. “Socioeconomic Influences on Biodiversity, Ecosystem Services and Human Well-Being: A Quantitative Application of the DPSIR Model in Jiangsu, China.” Science of the Total Environment 490: 1012–1028. doi:10.1016/j.scitotenv.2014.05.071.

Hueting, Roefie, and L. Reijnders. 2004. “Broad Sustainability Contra Sustainability: The Proper Construction of Sustainability Indicators.” Ecological Economics 50 (3–4): 249–260. doi:10.1016/j.ecolecon.2004.03.031.

IFPRI (International Food Policy Research Institute). 2015. 2014–2015 Global Food Policy Report. Washington, DC: IFPRI. doi:10.2499/9780896295759.

Innes, Judith Eleanor. 1975. Social Indicators and Public Policy: Interactive Processes of Design and Application. Amsterdam: Elsevier.

Innes, Judith Eleanor. 1989. “Disappointments and Legacies of Social Indicators.” Journal of Public Policy 9 (4): 429–432. doi:10.1017/S0143814X00008291.

Innes, Judith Eleanor, and David E. Booher. 2000. “Indicators for Sustainable Communities: A Strategy Building on Complexity Theory and Distributed Intelligence.” Planning Theory and Practice 1 (2): 173–186. doi:10.1080/14649350020008378.

Innes, Judith Eleanor, and David E. Booher. 2010. Planning with Complexity: An Introduction to Collaborative Rationality for Public Policy. London: Routledge.

Jacob, Steve, and Michael Jepson. 2009. “Creating a Community Context for the Fishery Stock Sustainability Index.” Fisheries 34 (5): 228–231. doi:10.1577/1548-8446-34.5.228.

Johns, Grace, Donna J. Lee, Vernon Leeworthy, Joseph Boyer, and William Nuttle. 2014. “Developing Economic Indices to Assess the Human Dimensions of the South Florida Coastal Marine Ecosystem Services.” Ecological Indicators 44: 69–80. doi:10.1016/j.ecolind.2014.04.014.

Kristensen, Peter. 2004. “The DPSIR Framework.” Paper presented at a United Nations Environment Programme workshop, Nairobi, Kenya, 27–29 September. http://wwz.ifremer.fr/dce_eng/content/download/69291/913220/file/DPSIR.pdf.

Land, Kenneth C., Alex C. Michalos, and Joseph Sirgy, eds. 2011. Handbook of Social Indicators and Quality of Life Research. Berlin: Springer Science and Business Media.

Land, Kenneth C., and Seymour Spilerman, eds. 1975. Social Indicator Models. New York: Russell Sage Foundation.

Le Gentil, Eric, and Rémi Mongruel. 2015. “A Systematic Review of Socio-Economic Assessments in Support of Coastal Zone Management (1992–2011).” Journal of Environmental Management 149: 85–96. doi:10.1016/j.jenvman.2014.10.018.

MacDowall, Lachlan, Marnie Badham, Emma Blomkamp, and Kim Dunphy, eds. 2016. Making Culture Count: The Politics of Cultural Measurement. Berlin: Springer.

Maloney, John C. 1968. “Review of Social Indicators, by Raymond A. Bauer.” Journal of Business 41 (1): 115–118.

Martínez, Javier, and Emile Dopheide. 2014. “Indicators: From Counting to Communicating.” Journal for Education in the Built Environment 9 (1): 1–19. doi:10.11120/jebe.2014.00009.

McBain, Darian, and Ali Alsamawi. 2014. “Quantitative Accounting for Social Economic Indicators.” Natural Resources Forum 38 (3): 193–202. doi:10.1111/1477-8947.12044.

MEA (Millennium Ecosystem Assessment). 2005. Ecosystems and Human Well-Being: Synthesis. Washington, DC: Island Press. http://www.millenniumassessment.org/documents/document.356.aspx.pdf.

Noll, Heinz-Herbert. 2002. “Towards a European System of Social Indicators: Theoretical Framework and System Architecture.” Social Indicators Research 58 (1–3): 47–87.

Noll, Heinz-Herbert. 2004. “Social Indicators and Quality of Life Research: Background, Achievements and Current Trends.” In Advances in Sociological Knowledge: Over Half a Century, ed. Nikolai Genov, 151–181. Berlin: Springer.

Ostrom, Elinor. 2009. “A General Framework for Analyzing Sustainability of Social-Ecological Systems.” Science 325 (5939): 419–422. doi:10.1126/science.1172133.

Petterson, John S., Edward Glazier, Laura D. Stanley, Carson Mencken, Karl Eschbach, Patrick Moore, and Pamela Goode. 2008. Benefits and Burdens of OCS Activities on States, Labor Market Areas, Coastal Counties, and Selected Communities. OCS Study MMS 2008-052. New Orleans, LA: BOEM Gulf of Mexico OCS Region. https://www.boem.gov/ESPIS/4/4537.pdf.

Pollnac, Richard B., Susan Abbott-Jamieson, Courtney Smith, Marc L. Miller, Patricia M. Clay, and Bryan Oles. 2006. “Toward a Model for Fisheries Social Impact Assessment.” Marine Fisheries Review 68 (1–4): 1–18.

Sawicki, David S. 2002. “Improving Community Indicator Systems: Injecting More Social Science into the Folk Movement.” Planning Theory and Practice 3 (1): 13–32. doi:10.1080/14649350220117780.

Scott, James C. 1998. Seeing Like a State: How Certain Schemes to Improve the Human Condition Have Failed. New Haven, CT: Yale University Press.

Sheldon, Eleanor B., and Howard E. Freeman. 1970. “Notes on Social Indicators: Promises and Potential.” Policy Sciences 1 (1): 97–111. doi:10.1007/BF00145195.

Smit, Barry, and Johanna Wandel. 2006. “Adaptation, Adaptive Capacity and Vulnerability.” Global Environmental Change 16 (3): 282–292. doi:10.1016/j.gloenvcha.2006.03.008.

Tietenberg, Thomas. 2004. Environmental Economics and Policy. 4th ed. Boston: Pearson Addison-Wesley.

UN-DESA (United Nations Division for Sustainable Development). 1992. Agenda 21. UN Conference on Environment and Development, Rio de Janeiro, 3–14 June. https://sustainabledevelopment.un.org/content/documents/Agenda21.pdf.

UNEP (United Nations Environment Programme). 2015. An Introduction to Environmental Assessment. Cambridge: UNEP World Conservation Monitoring Centre. http://wedocs.unep.org//handle/20.500.11822/7557.

Walker, Brian, Lance Gunderson, Ann Kinzig, Carl Folke, Steve Carpenter, and Lisen Schultz. 2006. “A Handful of Heuristics and Some Propositions for Understanding Resilience in Social-Ecological Systems.” Ecology and Society 11 (1): 80–94.

Weller, Susan C. 2007. “Cultural Consensus Theory: Applications and Frequently Asked Questions.” Field Methods 19 (4): 339–368. doi:10.1177/1525822X07303502.

White, Howard D. 1983. “A Cocitation Map of the Social Indicators Movement.” Journal of the American Society for Information Science 34 (5): 307–312. doi:10.1002/asi.4630340502.

Wilson, Jeffrey, Peter Tyedmers, and Ronald Pelot. 2007. “Contrasting and Comparing Sustainable Development Indicator Metrics.” Ecological Indicators 7 (2): 299–314. doi:10.1016/j.ecolind.2006.02.009.

Wong, Cecilia. 2003. “Indicators at the Crossroads: Ideas, Methods and Applications.” Town Planning Review 74 (3): 253–279. doi:10.3828/tpr.74.3.1.

Wong, Cecilia. 2006. Indicators for Urban and Regional Planning: The Interplay of Policy and Methods. London: Routledge.

Yohe, Gary, Elizabeth Malone, Antoinette Brenkert, Michael Schlesinger, Henk Meij, Xiaoshi Xing, and Daniel Lee. 2006. A Synthetic Assessment of the Global Distribution of Vulnerability to Climate Change from the IPCC Perspective That Reflects Exposure and Adaptive Capacity. Palisades, NY: Center for International Earth Science Information Network, Columbia University. http://sedac.ciesin.columbia.edu/mva/ccv/sagdreport.pdf.

Yoskowitz, David, and Marc Russell. 2015. “Human Dimensions of Our Estuaries and Coasts.” Estuaries and Coasts 38 (S1): 1–8. doi:10.1007/s12237-014-9926-y.
List of Analyzed Reports and Publications (2000–2016)

Abt Associates. 2015. Developing Socio-economic Metrics to Measure DOI Hurricane Sandy Project and Program Outcomes. Contract #50937. Washington, DC: National Fish and Wildlife Foundation. https://www.doi.gov/sites/doi.gov/files/uploads/Socio_Economic_Metrics_Final_Report_11DEC2015_0.pdf.

Alessa, Lilian, Andrew Kliskey, Richard Lammers, Chris Arp, Dan White, Larry Hinzman, and Robert Busey. 2008. “The Arctic Water Resource Vulnerability Index: An Integrated Assessment Tool for Community Resilience and Vulnerability with Respect to Freshwater.” Environmental Management 42 (3): 523–541. USGS. doi:10.1007/s00267-008-9152-0.

Allen, Stuart, and Paul Bartram. 2008. Guam as a Fishing Community. Administrative Report H-08-01. Honolulu, HI: NOAA Pacific Islands Fisheries Science Center. https://www.pifsc.noaa.gov/library/pubs/admin/PIFSC_Admin_Rep_08-01.pdf.

Aratame, Natsumi, and Joachim Singelmann. 2002. Socioeconomic Baseline Study for the Gulf of Mexico—Final Report: A Description of the Dataset, 1930–1990. OCS Study MMS 2002-054. New Orleans, LA: BOEM Gulf of Mexico OCS Region. https://www.boem.gov/ESPIS/3/3082.pdf.

Biedenweg, Kelly, Adi Hanein, Kara Nelson, Kari Stiles, Katharine Wellman, Julie Horowitz, and Stacy Vynne. 2014. “Developing Human Wellbeing Indicators in the Puget Sound: Focusing on the Watershed Scale.” Coastal Management 42 (4): 374–390. doi:10.1080/08920753.2014.923136. EPA and NOAA collaboration.

Bousquin, Justin, Kristen Hychka, and Marisa Mazzotta. 2015. Benefit Indicators for Flood Regulation Services of Wetlands: A Modeling Approach. EPA/600/R-15/191. Narragansett, RI: EPA National Health and Environmental Effects Research Laboratory, Atlantic Ecology Division. https://cfpub.epa.gov/si/si_public_file_download.cfm?p_download_id=525390.

Boyd, James, Paul Ringold, Alan Krupnick, Robert J. Johnston, Matthew A. Weber, and Kim Hall. 2015. Linking Indicators: Key Research Questions to Guide Decisions on What to Measure, Map and Model. Washington, DC: Resources for the Future. EPA National Health and Environmental Effects Research Laboratory, Western Ecology Division. http://www.rff.org/files/document/file/RFF-DP-15-40.pdf.

Breslow, Sara, Dan Holland, Phil Levin, Karma Norman, Melissa Poe, Cindy Thomson, Raz Barnea, et al. 2013. Human Dimensions of the CCIEA: A Summary of Concepts, Methods, Indicators, and Assessments. La Jolla, CA: NOAA Southwest Fisheries Science Center. https://swfsc.noaa.gov/publications/CR/2014/2014Breslow.pdf.

Bunce, Leah, Philip Townsley, Robert S. Pomeroy, and Richard Pollnac. 2000. Socioeconomic Manual for Coral Reef Management. Townsville: Australian Institute of Marine Science with NOAA and International Union for Conservation of Nature. http://www.icriforum.org/sites/default/files/GCRMN_Socioeconomic.pdf.

Campbell, Chris. 2011. Social Indicators in Coastal Alaska: Arctic Communities. AK-11-09. Anchorage: BOEM Alaska OCS Region.

Chang, Heejun, Il-Won Jung, Angela Strecker, Daniel Wise, Martin Lafrenz, Vivek Shandas, Hamid Moradkhani, et al. 2013. “Water Supply, Demand, and Quality Indicators for Assessing the Spatial Distribution of Water Resource Vulnerability in the Columbia River Basin.” Atmosphere-Ocean 51 (4): 339–356. USGS. doi:10.1080/07055900.2013.777896.

Clay, Patricia M., Patricia Pinto da Silva, and Andrew Kitts. 2010. “Defining Social and Economic Performance Measures for Catch Share Systems in the Northeast U.S.” Paper presented at the 15th Biennial Conference of the International Institute of Fisheries Economics and Trade, Montpellier, France, 13–16 July. NOAA National Marine Fisheries Service.

Clay, Patricia M., Lisa L. Colburn, Julia Olson, Patricia Pinto da Silva, Sarah L. Smith, Azure Westwood, and Julie Ekstrom. 2010. Community Profiles for the Northeast US Fisheries. Woods Hole, MA: NOAA Northeast Fisheries Science Center, Social Sciences Branch. https://www.nefsc.noaa.gov/read/socialsci/pdf/communityProfiles/introduction.pdf.

Colburn, Lisa L., Patricia M. Clay, Tarsila Seara, Changhua Weng, and Angela Silva. 2015. “Social and Economic Impacts of Hurricane/Post Tropical Cyclone Sandy on the Commercial and Recreational Fishing Industries: New York and New Jersey One Year Later.” Technical Memorandum NMFS-F/SPO-157. Narragansett, RI: NOAA National Marine Fisheries Service. https://www.st.nmfs.noaa.gov/Assets/economics/documents/sandy/social-econ-hurricane-sandy.pdf.

Colburn, Lisa L., Michael Jepson, Changhua Weng, Tarsila Seara, Jeremy Weiss, and Jonathan A. Hare. 2016. “Indicators of Climate Change and Social Vulnerability in Fishing Dependent Communities along the Eastern and Gulf Coasts of the United States.” Marine Policy 74: 323–333. doi:10.1016/j.marpol.2016.04.030.

Cutter, Susan L., Kevin D. Ash, and Christopher T. Emrich. 2014. “The Geographies of Community Disaster Resilience.” Global Environmental Change 29: 65–77. doi:10.1016/j.gloenvcha.2014.08.005. Extension collaboration with University.

Cutter, Susan L., Bryan J. Boruff, and W. Lynn Shirley. 2003. “Social Vulnerability to Environmental Hazards.” Social Science Quarterly 84 (2): 242–261. doi:10.1111/1540-6237.8402002. Extension collaboration with University.

Dillard, Maria K., Theresa L. Goedeke, Susan Lovelace, and Angela Orthmeyer. 2013. Monitoring Well-Being and Changing Environmental Conditions in Coastal Communities: Development of an Assessment Method. Technical Memorandum NOS NCCOS 174. Silver Spring, MD: NOAA National Centers for Coastal Ocean Science. http://www.coastalscience.noaa.gov/publications/handler.aspx?key=5367.

Dismukes, David E., Williams O. Olatubi, Dmitry V. Mesyanzhinov, and Allan G. Pulsipher. 2003. Modeling the Economic Impacts of Offshore Oil and Gas Activities in the Gulf of Mexico: Methods and Applications. OCS Study MMS 2003-018. New Orleans, LA: BOEM Gulf of Mexico OCS Region. https://www.boem.gov/ESPIS/2/3016.pdf.

Donatuto, Jamie, Eric E. Grossman, John Konovsky, Sarah Grossman, and Larry W. Campbell. 2014. “Indigenous Community Health and Climate Change: Integrating Biophysical and Science Indicators.” Coastal Management 42 (4): 355–373. doi:10.1080/08920753.2014.923140. USGS/EPA collaboration.

Ekstrom, Julia A., Lisa Suatoni, Sarah R. Cooley, Linwood H. Pendleton, George G. Waldbusser, Josh E. Cinner, Jessica Ritter, et al. 2015. “Vulnerability and Adaptation of US Shellfisheries to Ocean Acidification.” Nature Climate Change 5: 207–214. doi:10.1038/nclimate2508. NOAA/University collaboration.

EPA (Environmental Protection Agency). 2012. Indicators and Methods for Constructing a U.S. Human Well-Being Index (HWBI) for Ecosystem Services Research. EPA/600/R-12/023. Gulf Breeze, FL: EPA Office of Research and Development, Gulf Ecology Division. http://nepis.epa.gov/Exe/ZyPURL.cgi?Dockey=P100GRUA.txt.

EPA. 2014. Climate Change Indicators in the United States. 3rd ed. EPA 430-R-14-004. Washington, DC: EPA Office of Atmospheric Program.

EPA. 2015. Indicator Development for Estuaries. Washington, DC: EPA. https://www.epa.gov/sites/production/files/2015-09/documents/indicators_manual.pdf.

EPA. 2017. “USEPA Environmental Quality Index (EQI): Air, Water, Land, Built, and Sociodemographic Domains Transformed Variables Dataset as Input for the USEPA EQI, by County for the United States.” Data.gov, updated 23 May. https://catalog.data.gov/dataset/usepa-environmental-quality-index-eqi-air-water-land-built-and-sociodemographic-domains-transf.

EPA Office of Water. 2011. “A Rapid Screening Assessment of Brook Trout Recovery Potential in Mining-Impacted Middle Atlantic Region Watersheds.” https://www.epa.gov/sites/production/files/2015-11/documents/midatlprojsum110928.pdf.

EPA Office of Water. 2016a. “A Multi-Scale Screening Assessment of Recovery Potential in Maryland Watersheds.” https://www.epa.gov/sites/production/files/2015-11/documents/mdprojsum110928.pdf (accessed 30 April 2016).

EPA Office of Water. 2016b. “Comparing the Restorability of Illinois Impaired Waters: A Recovery Potential Pilot Study.” https://www.epa.gov/sites/production/files/2015-11/documents/ilprojsum1110928.pdf (accessed 30 April 2016).

EPA Office of Wetlands, Oceans, and Watersheds. 2002. Index of Watershed Indicators: An Overview. Washington, DC. http://mrwa.org/wp-content/uploads/repository/epa-indicators.pdf.

Fiksel, Joseph, Tarsha Eason, and Herbert Frederickson. 2013. A Framework for Sustainability Indicators at EPA. EPA/600/R/12/687. Washington, DC: EPA. http://nepis.epa.gov/Exe/ZyPURL.cgi?Dockey=P100FZZ7.txt.

Fleming, Chloe S., Flavia Tonioli, and Juan J. Agar. 2014. A Review of Principal Coastal Economic Sectors within the Southeast United States and U.S. Caribbean. Technical Memorandum NMFS-SEFSC-669. Miami, FL: NOAA National Marine Fisheries Service, Southeast Fisheries Science Center. doi:10.7289/V5J10135.

Fox, William E., Daniel McCollum, John E. Mitchell, Louis E. Swanson, Urs P. Kreuter, John A. Tanaka, Gary R. Evans, et al. 2009. “An Integrated Social, Economic, and Ecologic Conceptual (ISEEC) Framework for Considering Rangeland Sustainability.” Society and Natural Resources 22 (7): 593–606. USGS.

FS (US Forest Service). 2004. National Report on Sustainable Forests—2003. FS-766. Washington, DC: USDA. http://www.fs.fed.us/research/sustain/docs/national-reports/2003/2003-sustainability-report.pdf.

Genskow, Ken, and Linda Prokopy, eds. 2011. The Social Indicator Planning and Evaluation System (SIPES) for Nonpoint Source Management: A Handbook for Watershed Projects. 3rd ed. Madison, WI: Great Lakes Regional Water Program. http://35.8.121.111/si/Info/pdfs/SI_Handbook_v4_02012012.pdf.

Grace-McCaskey, Cynthia A. 2014. Examining the Potential of Using Secondary Data to Better Understand Human-Reef Relationships across the Pacific. Administrative Report H-14-01. Honolulu, HI: NOAA Pacific Islands Fisheries Science Center. https://www.pifsc.noaa.gov/library/pubs/admin/PIFSC_Admin_Rep_14-01.pdf.

Grace-McCaskey, Cynthia A. 2012. “Development of Indicators for Measuring Effects of Human Activities on U.S. Pacific Coral Reefs.” Paper presented at the 12th International Coral Reef Symposium, Cairns, Australia, 9–13 July. https://www.pifsc.noaa.gov/library/pubs/grace-mccaskey_ca-icrs2012_22a_4_1.pdf.

Griffith, David, Manuel Valdés Pizzini, and Carlos García Quijano. 2007. Entangled Communities: Socioeconomic Profiles of Fishers, Their Communities, and Their Responses to Marine Protective Measures in Puerto Rico. Ed. J. J. Agar and B. Stoffle. Technical Memorandum NMFS-SEFSC-556. Miami, FL: NOAA Southeast Fisheries Science Center. https://repository.library.noaa.gov/view/noaa/4395.

Hall-Arber, Madeleine, Chris Dyer, John Poggie, James McNally, and Renee Gagne. 2002. New England’s Fishing Communities. A Final Report for Marine Fisheries Initiative (MARFIN) Grant #NA87FF0547. MIT Sea Grant no. 01-15. Cambridge, MA: MIT Sea Grant. http://seagrant.mit.edu/cmss/marfin/toc.pdf.

Hastings, David. 2011. “The Human Security Index: Potential Roles for the Environmental and Earth Observation Communities.” Earthzine, 4 May. NOAA.

Hastings, David. 2012. “The Human Security Index and National and Global Climate Assessments: How to Improve the Comparability of County-level Weather-Climate and Societal Indicators?” Presentation at the American Meteorological Society Annual Meeting, 22–26 January. https://ams.confex.com/ams/92Annual/webprogram/Paper195260.html.

Heinz Center. 2003. The Coastal Zone Management Act: Developing a Framework for Identifying Performance Indicators. NOAA Grant NA160Z1436. Washington, DC: Heinz Center. https://coast.noaa.gov/czm/media/heinzczmaframework.pdf.

Hicks, Christina C., Arielle Levine, Arun Agrawal, Xavier Basurto, Sara J. Breslow, Courtney Carothers, Susan Charnley, et al. 2016. “Engage Key Social Concepts for Sustainability.” Science 352 (6281): 38–40. doi:10.1126/science.aad4977. NOAA, Universities, USDA, and Conservation International collaboration.

Himes-Cornell, Amber, and Stephen Kasperski. 2015. “Assessing Climate Change Vulnerability in Alaska’s Fishing Communities.” Fisheries Research 162: 1–11. doi:10.1016/j.fishres.2014.09.010. NOAA.

Himes-Cornell, Amber, and Mike Orbach, with Stewart Allen, Guillermo Auad, Mary Boatman, Patricia M. Clay, Mike Dalton, et al. 2012. “Impacts of Climate Change on Human Uses of the Ocean.” In Oceans and Marine Resources in a Changing Climate: Technical Input to the 2013 National Climate Assessment, ed. Roger Griffis and Jennifer Howard, 64–118. Washington, DC: Island Press. NOAA.

Himes-Cornell, Amber, Christina Package, and Alison Durland. 2011. Improving Community Profiles for the North Pacific Fisheries. Technical Memorandum NMFS-AFSC-230. Seattle: NOAA Alaska Fisheries Science Center. https://www.afsc.noaa.gov/publications/afsc-tm/noaa-tm-afsc-230.pdf.

Hospital, Justin, and Courtney Beavers. 2012. Economic and Social Characteristics of Bottomfish Fishing in the Main Hawaiian Islands. Administrative Report H-12-01. Honolulu, HI: NOAA Pacific Islands Fisheries Science Center. https://www.pifsc.noaa.gov/library/pubs/admin/PIFSC_Admin_Rep_12-01.pdf.

Hospital, Justin, and Courtney Beavers. 2012. Economic and Social Characteristics of Guam’s Small Boat Fisheries. Administrative Report H-12-06. Honolulu, HI: NOAA Pacific Islands Fisheries Science Center. https://www.pifsc.noaa.gov/library/pubs/admin/PIFSC_Admin_Rep_12-06.pdf.

Hospital, Justin, and Courtney Beavers. 2014. Economic and Social Characteristics of Small Boat Fishing in the Commonwealth of the Northern Mariana Islands. Administrative Report H-14-02. Honolulu, HI: NOAA Pacific Islands Fisheries Science Center. https://www.pifsc.noaa.gov/library/pubs/admin/PIFSC_Admin_Rep_14-02.pdf.

Hospital, Justin, Skaidra Scholey Bruce, and Minling Pan. 2011. Economic and Social Characteristics of the Hawaii Small Boat Pelagic Fishery. Administrative Report H-11-01. Honolulu, HI: NOAA Pacific Islands Fisheries Science Center. https://www.pifsc.noaa.gov/library/pubs/admin/PIFSC_Admin_Rep_11-01.pdf.

Huang, Hongtai, and Timothy Barzyk. 2015. “Identification and Quantification of Cumulative Factors that Increase Environmental Exposures and Impacts.” Paper presented at the International Society of Exposure Science 2015 Annual Meeting, Henderson, NV, 18–22 October.

ICES (International Council for the Exploration of the Sea). 2011. Report of the Working Group on the Northwest Atlantic Regional Sea (WGNARS), 8–10 February 2011, Halifax, Canada. ICES CM 2011/SSGRSP:01. Copenhagen: ICES. http://www.ices.dk/sites/pub/Publication%20Reports/Expert%20Group%20Report/SSGRSP/2011/WGNARS11.pdf.

Impact Assessment, Inc. 2004. Identifying Communities Associated with the Fishing Industry in Louisiana. Contract WC133F-02-SE-0297. St. Petersburg, FL: NOAA Fisheries Southeast Regional Office. http://sero.nmfs.noaa.gov/sustainable_fisheries/social/documents/pdfs/communities/2013/ascension_lafayette.pdf.

Impact Assessment, Inc. 2005. Identifying Communities Associated with the Fishing Industry along the Florida Gulf Coast. Contract WC133F-02-SE-0298. St. Petersburg, FL: NOAA Fisheries Southeast Regional Office. http://sero.nmfs.noaa.gov/sustainable_fisheries/social/documents/pdfs/communities/2013/escambia_levy.pdf.

Impact Assessment, Inc. 2005. Identifying Communities Associated with the Fishing Industry in Texas. Contract WC133F-02-SE-0603. St. Petersburg, FL: NOAA Fisheries Southeast Regional Office. http://sero.nmfs.noaa.gov/sustainable_fisheries/social/documents/pdfs/communities/2013/texas.pdf.

Impact Assessment, Inc. 2006. Identifying Communities Associated with the Fishing Industry in Alabama and Mississippi. Contract WC133F-03-SE-0603. St. Petersburg, FL: NOAA Fisheries Southeast Regional Office. http://sero.nmfs.noaa.gov/sustainable_fisheries/social/documents/pdfs/communities/2013/alabama_mississippi.pdf.

Impact Assessment, Inc. 2006. Preliminary Assessment of the Impacts of Hurricane Katrina on Gulf of Mexico Coastal Fishing Communities. Contract WC133F-06-CN-0003. St. Petersburg, FL: NOAA Fisheries Southeast Regional Office.

Impact Assessment, Inc. 2007. Community Profiles and Socioeconomic Evaluations of Marine Conservation Districts: St. Thomas and St. John, U.S. Virgin Islands. Ed. J. J. Agar and B. Stoffle. NOAA Series on U.S. Caribbean Fishing Communities. Technical Memorandum NMFS-SEFSC-557. Miami, FL: NOAA Southeast Fisheries Science Center.

Jacob, Steve, Michael Jepson, Carlton Pomeroy, David Mulkey, Chuck Adams, and Suzanna Smith. 2001. Identifying Fishing Dependent Communities: Development and Confirmation of a Protocol. A MARFIN Project and Report. St. Petersburg, FL: NOAA Fisheries Southeast Regional Office. http://www.st.nmfs.noaa.gov/st1/econ/cia/FLFishingCommMARFINReport.pdf.

Jacob, Steve, Priscilla Weeks, Ben Blount, and Michael Jepson. 2013. “Development and Evaluation of Social Indicators of Vulnerability and Resiliency for Fishing Communities in the Gulf of Mexico.” Marine Policy 37: 86–95. doi:10.1016/j.marpol.2012.04.014. NOAA Sea Grant.

Jepson, Michael, and Lisa L. Colburn. 2013. Development of Social Indicators of Fishing Community Vulnerability and Resilience in the U.S. Southeast and Northeast Regions. Technical Memorandum NMFS-F/SPO-129. Silver Spring, MD: NOAA National Marine Fisheries Service. http://spo.nmfs.noaa.gov/tm/TM129.pdf.

Jepson, Michael, Kathi Kitner, Ana Pitchon, Wendy Wicke Perry, and Brent Stoffle. 2002. Potential Fishing Communities in the Carolinas, Georgia and Florida: An Effort in Baseline Profiling and Mapping. St. Petersburg, FL: NOAA Fisheries Southeast Regional Office. http://sero.nmfs.noaa.gov/sustainable_fisheries/social/documents/pdfs/communities/2013/s_atl_communities.pdf.

Karnauskas, Mandy, Michael J. Schirripa, Christopher R. Kelble, Geoffrey S. Cook, and J. Kevin Craig, eds. 2013. Ecosystem Status Report for the Gulf of Mexico. Technical Memorandum NMFS-SEFSC-653. Miami, FL: NOAA Southeast Fisheries Science Center. http://archive.gulfcouncil.org/docs/Gulf%20of%20Mexico%20Ecosystem%20Status%20Report.pdf.

Kenney, Melissa A., Julie Maldonado, Robert S. Chen, and Dale Quattrochi. 2013. Climate Change Impacts and Responses: Societal Indicators for the National Climate Assessment. NCA Report Series, Vol. 5c. Washington, DC: US Global Change Research Program. doi:10.7916/D8C53KVJ.

Kruse, Jack, Marie Lowe, Sharman Haley, Ginny Fay, Larry Hamilton, and Matthew Berman. 2011. “Arctic Observing Network Social Indicators Project: Overview.” Polar Geography 34 (1–2): 1–8. BOEM and University collaboration. doi:10.1080/1088937X.2011.58446.

Lee, Donna J., Grace M. Johns, and Vernon R. Leeworthy. 2013. “Selecting Human Dimensions Economic Indicators for South Florida Coastal Marine Ecosystems.” Marine and Estuarine Goal Setting for South Florida (MARES) Whitepaper. Miami, FL: NOAA Atlantic Oceanographic and Meteorological Laboratory. http://www.aoml.noaa.gov/ocd/ocdweb/docs/MARES/MARES_WhitePaper9_SelectingHDSindicators_20130519.pdf.

Levine, Arielle, and Stewart Allen. 2009. American Samoa as a Fishing Community. Technical Memorandum NOAA-TM-NMFS-PIFSC-19. Honolulu, HI: NOAA Pacific Islands Fisheries Science Center. https://www.pifsc.noaa.gov/tech/NOAA_Tech_Memo_PIFSC_19.pdf.

Loper, Cindy, Robert Pomeroy, Vineeta Hoon, Patrick McConney, Maria Pena, Arie Sanders, Gaya Sriskanthan, et al. 2008. Socioeconomic Conditions along the World’s Tropical Coasts: 2008. SocMon Global Report. NOAA, Global Coral Reef Monitoring Network, and Conservation International. http://www.conservation.org/publications/documents/CI_Marine_Socioeconomic_Conditions_Along_Worlds_Tropical_Coasts_2008.pdf.

Lovelace, Susan, and Maria Dillard. 2012. Developing Social and Economic Indicators for Monitoring the U.S. Coral Reef Jurisdictions: Report from a Scientific Workshop to Support the National Coral Reef Monitoring Program. Charleston, SC: NOAA Hollings Marine Laboratory and NOAA Coral Reef Conservation Program. https://data.nodc.noaa.gov/coris/library/NOAA/CRCP/project/626_Loper/Social_and_Economic_Indicators_for_Monitoring_the_U.S._Coral_Reef_Jurisdictions_Workshop_Report_2012.pdf.

Lovelace, Susan, Pamela Fletcher, Maria Dillard, William Nuttle, Shona Patterson, Peter Ortner, David Loomis, and Manoj Shivlani. 2013. Selecting Human Dimensions for South Florida’s Coastal Marine Ecosystem: Noneconomic Indicators. Marine and Estuarine Goal Setting for South Florida (MARES) Whitepaper. Miami, FL: NOAA Atlantic Oceanographic and Meteorological Laboratory.

Luke, Ronald T., Eric S. Schubert, Greg Olsson, and F. Larry Leistritz. 2002. Socioeconomic Baseline and Projections of the Impact of an OCS Onshore Base for Selected Florida Panhandle Communities, Volume 1: Final Report. OCS Study MMS 2002-024. New Orleans, LA: BOEM Gulf of Mexico OCS Region. https://www.boem.gov/ESPIS/2/3065.pdf.

Luton, Harry. 2013. Subsistence in Coastal Louisiana: An Exploratory Study. GM-09-01-09. New Orleans, LA: BOEM Gulf of Mexico OCS Region. https://www.boem.gov/GM-09-01-09.

Luxton, Todd, David Carson, Gordon Evans, Mark Kemper, Kirk Scheckel, Stephen Wright, and Hale Thurston. 2014. Methods, Metrics, and Indicators Available for Identifying and Quantifying Economic and Social Impacts Associated with Beneficial Reuse Decisions: A Review of the Literature. EPA/600/R-14/237. Washington, DC: EPA Office of Research and Development. https://cfpub.epa.gov/si/si_public_record_report.cfm?dirEntryId=294852.

MitFLG (Mitigation Framework Leadership Group), FEMA, and NOAA. 2016. Draft Interagency Concept for Community Resilience Indicators. MitFLG Draft Concept Paper Published for Stakeholder Comment. Washington, DC: Department of Homeland Security. https://www.fema.gov/media-library-data/1466085676217-a14e229a461adfa574a5d03041a6297c/FEMA-CRI-Draft-Concept-Paper-508_Jun_2016.pdf.

Newman, Peter, Robert Manning, and Bill Valliere. 2002. “Integrating Resource, Social and Managerial Indicators of Quality into Carrying Capacity Decision Making.” In Proceedings of the 2001 Northeastern Recreation Research, ed. Sharon Todd, 233–238. Newtown Square, PA: USDA Forest Service Northeastern Research Station. https://www.nrs.fs.fed.us/pubs/gtr/gtr_ne289/gtr_ne289_233.pdf.

NOAA (National Oceanic and Atmospheric Administration) Office for Coastal Management. (2011) 2016. Coastal Zone Management Act Performance Measurement System: Coastal Management Program Guidance. https://coast.noaa.gov/czm/media/czmapmsguide11.pdf.

Norman, Karma, Jennifer Sepez, Heather Lazrus, Nicole Milne, Christina Package, Suzanne Russell, Kevin Grant, et al. 2007. Community Profiles for West Coast and North Pacific Fisheries: Washington, Oregon, California, and Other U.S. States. Technical Memorandum NMFS-NWFSC-85. Silver Spring, MD: NOAA National Marine Fisheries Service. https://www.nwfsc.noaa.gov/assets/25/499_01082008_153910_CommunityProfilesTM85WebFinalSA.pdf.

Package, Christina, and Jennifer Sepez. 2004. “Fishing Communities of the North Pacific: Social Science Research at the Alaska Fisheries Science Center.” AFSC Quarterly Research Report (April-May-June). NOAA Alaska Fisheries Science Center. http://www.afsc.noaa.gov/Quarterly/amj2004/amj04feat.pdf.

Palawan State University. 2012. Socioeconomic Monitoring (SocMon) Program in the Philippines to Support Effective Coral Reef Conservation and Coastal Resources Management: Initiation in Oriental Mindoro Province and Continuation in Puerto Princesa City, Palawan Province. Silver Spring, MD: NOAA. https://data.nodc.noaa.gov/coris/library/NOAA/CRCP/other/grants/International_FY10_Products/NA10NOS4630056_Philippines_Socmon.pdf.

Petterson, John S., Edward Glazier, Laura D. Stanley, Carson Mencken, Karl Eschbach, Patrick Moore, and Pamela Goode. 2008. Benefits and Burdens of OCS Activities on States, Labor Market Areas, Coastal Counties, and Selected Communities. OCS Study MMS 2008-052. New Orleans, LA: BOEM Gulf of Mexico OCS Region. https://www.boem.gov/ESPIS/4/4537.pdf.

Pollnac, Richard B., Tarsila Seara, Lisa L. Colburn, and Michael Jepson. 2015. “Taxonomy of USA East Coast Fishing Communities in Terms of Social Vulnerability and Resilience.” Environmental Impact Assessment Review 55: 136–143. NOAA. doi:10.1016/j.eiar.2015.08.006.

Puget Sound Partnership. 2016. “Scope of the Puget Sound Vital Signs.” https://pspwa.app.box.com/v/vitalsignscopeapr2016.

Reams, Margaret A., and Nina S.N. Lam. 2013. Socioeconomic Responses to Coastal Land Loss and Hurricanes: Measuring Resilience among Outer Continental Shelf-related Coastal Communities in Louisiana. OCS Study BOEM 2013-0111. New Orleans, LA: BOEM Gulf of Mexico OCS Region. https://www.boem.gov/ESPIS/5/5261.pdf.

Reeder, Richard J., and Dennis M. Brown. 2005. “Recreation, Tourism, and Rural Well-Being.” Economic Research Report No. 7. Washington, DC: USDA Economic Research Service. https://www.ers.usda.gov/webdocs/publications/46126/15112_err7_1_.pdf?v=41056.

Reedy-Maschner, Katherine, and Herbert Maschner. 2012. Subsistence Study for the North Aleutian Basin. OCS Study BOEM 2012-109. Anchorage: BOEM Alaska Region. https://www.boem.gov/ESPIS/5/5308.pdf.

Saleem, Maria. 2012. Socioeconomic Monitoring and Assessment for Coral Reef Management at Nassimo Thila and Banana Reef, Kaafu Atoll, Maldives. Project completion report NA10NOS4630055. Silver Spring, MD: NOAA. https://data.nodc.noaa.gov/coris/library/NOAA/CRCP/other/grants/International_FY10_Products/NA10NOS4630055_Maldives_Socmon.pdf.

Sempier, Tracie T., Don L. Swann, Rod L. Emmer, Stephen. H. Sempier, and Melissa Schneider. 2010. Coastal Community Resilience Index: A Community Self-Assessment—Understanding How Prepared Your Community Is for a Disaster. MASGP-08-014. Ocean Springs, MS: Mississippi-Alabama Sea Grant Consortium and NOAA Coastal Storms Program. http://masgc.org/assets/uploads/publications/662/coastal_community_resilience_index.pdf.

Sepez, Jennifer A., Bryan D. Tilt, Christina L. Package, Heather M. Lazrus, and Ismael Vaccaro. 2005. Community Profiles for North Pacific Fisheries: Alaska. Technical Memorandum NMFS-AFSC-160. Seattle: NOAA Alaska Fisheries Science Center. https://www.afsc.noaa.gov/Publications/AFSC-TM/NOAA-TM-AFSC-160/NOAA-TM-AFSC-160.pdf.

Sheppard, Stephen R. J., and Michael Meitner. 2005. “Using Multi-Criteria Analysis and Visualisation for Sustainable Forest Management Planning with Stakeholder Groups.” Forest Ecology and Management 207 (1–2): 171–187. doi:10.1016/j.foreco.2004.10.032.

Sherrouse, Benson C., Darius J. Semmens, and Jessica M. Clement. 2014. “An Application of Social Values for Ecosystem Services (SolVES) to Three National Forests.” Ecological Indicators 36: 68–79. USGS. doi:10.1016/j.ecolind.2013.07.008.

Slonecker, Terrence. 2008. A Landscape Indicator Approach to the Identification and Articulation of the Ecological Consequences of Land Cover Change in the Chesapeake Bay Watershed, 1970–2000. Reston, VA: US Geological Survey (USGS). https://pubs.usgs.gov/fs/2008/3056/fs2008-3056.pdf.

Smith, Lisa M., Jason L. Case, Heather M. Smith, Linda C. Harwell, and James K. Summers. 2013. “Relating Ecosystem Services to Domains of Human Well-Being: Foundation for a US Index.” Ecological Indicators 28: 79–90. doi:10.1016/j.ecolind.2012.02.032.

Smith, Lisa M., and Linda Harwell. 2013. Tampa’s Well-Being: A Demonstration of ORD’s Human Well-Being Index. Washington, DC: EPA Office of Research and Development.

Smith, Lisa M., Christina M. Wade, Jason L. Case, Linda C. Harwell, Kendra R. Straub, and James K. Summers. 2015. “Evaluating the Transferability of a U.S. Human Well-Being Index (HWBI) Framework to Native American Populations.” Social Indicators Research 124 (1): 157–182. EPA. doi:10.1007/s11205-014-0775-7.

Smith, Sarah L., Richard B. Pollnac, Lisa L. Colburn, and Julia Olson. 2011. “Classification of Coastal Communities Reporting Commercial Fish Landings in the US Northeast Region: Developing and Testing a Methodology.” Marine Fisheries Review 73 (2): 41–61.

Solecki, William, Cynthia Rosenzweig, Reginald Blake, Alex de Sherbinin, Tom Matte, Fred Moshary, Bernice Rosenzweig, et al. 2015. “New York City Panel on Climate Change 2015 Report Chapter 6: Indicators and Monitoring.” Annals of the New York Academy of Sciences 1336 (1): 89–106. NOAA. doi:10.1111/nyas.12587.

SRWP (Sacramento River Watershed Program). 2010. Sacramento River Basin Report Card and Technical Report: Feather River Watershed. Chico, CA: SRWP. https://indicators.ucdavis.edu/waf/files/WHIP_TechRep_2010_0.pdf.

Summers, Kevin, Lisa M. Smith, Linda Harwell, and Kyle Buck. 2016. “Development of a Climate Resilience Screening Index (CRSI) and Its Potential for Application in the U.S.” Presentation at the International Conference of the Society for Human Ecology, Santa Ana, CA, 12–15 April. EPA.

Thering, S. 2005. Developing Indicators of Community Capacity and Documenting the Community Capacity Benefits of Citizen Participation. Project No. WIS04617. USDA Research, Education and Economics Information System. http://www.reeis.usda.gov/web/crisprojectpages/0195486-developing-indicators-of-community-capacity-and-documenting-the-community-capacity-benefits-of-citizen-participation.html.

USGCRP (US Global Change Research Program) Social Sciences Task Force. Social Sciences Interaction to Support USGCRP Strategic Plan Implementation. Washington, DC: USGCRP. http://www.globalchange.gov/sites/globalchange/files/SSTF-White-Paper-Final.pdf.

Villamagna, Amy M., Beatriz Mogollón, and Paul L. Angermeier. 2014. “A Multi-Indicator Framework for Mapping Cultural Ecosystem Services: The Case of Freshwater Recreational Fishing.” Ecological Indicators 45: 255–265. USGS. doi:10.1016/j.ecolind.2014.04.001.

Weber, Matthew A., and Paul L. Ringold. 2015. “Priority River Metrics for Residents of an Urbanized Arid Watershed.” Landscape and Urban Planning 133: 37–52. EPA. doi:10.1016/j.landurbplan.2014.09.006.

Wongbusarakum, Supin, and Christy Loper. 2011. Indicators to Assess Community-level Social Vulnerability to Climate Change: An Addendum to SocMon and SEM-Pasifika Regional Socioeconomic Monitoring Guidelines. Washington, DC: NOAA Global Socioeconomic Monitoring Initiative for Coastal Management (SocMon).

Contributor Notes

VICTORIA RAMENZONI is an environmental anthropologist. She was a Knauss Marine Policy Fellow at NOAA (2014) and a contractor in the Office of Program Planning Information, NOAA (2015). She also served as executive secretary for the Interagency Working Group on Ocean Social Science, Office of Science and Technology, National Ocean Council, White House (2014–2015). E-mail: victoria.ramenzoni@tamucc.edu

DAVID YOSKOWITZ’s research and policy work centers on environmental, ecological, and natural resource economics, as well as microeconomic development and border economics. He was NOAA’s chief economist from 2014 to 2015, where he worked on the consolidation of the social sciences at the federal level. E-mail: david.yoskowitz@tamucc.edu


Figures

Major Characteristics of Social Indicator Reports
Note: Main characteristics of the literature review findings. A total of 204 documents were identified, of which 97 were reports. After removing overlaps, 88 reports remained for analysis. The majority of reports originated from NOAA, with a marked increase in the number of documents in 2011.

Scree Plot of Unique Indicators
Note: This graph displays the frequency with which unique indicators are mentioned. Observe the different “elbows,” or discontinuities, marked by the arrows in the curve. The fourth discontinuity is used as the criterion for the content and similarity analyses.

Multidimensional Scaling Plot of the 52 Most Mentioned Indicators
Note: This figure presents a two-dimensional spatial analysis of similarities among reports. Each point represents one of the 52 most mentioned indicators. The distance between points reflects similarity among the indicators: points that are closer together represent items that are often included together in indicator frameworks. The stress level for the MDS was 0.0086.

Cluster Analysis of Indicators
Note: Dendrogram displaying the clustering of the 52 indicators. The different branches represent structures around which indicators cluster according to their similarity. Observe that the indicators are unequally distributed across these structures, which may reflect differences in how some constructs are operationalized in the reports. In addition, the parent structures may show some overlap with the theoretical frameworks discussed previously. A minimal code sketch of the similarity, scaling, and clustering steps behind these figures follows the figure list.
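For readers who wish to reproduce this kind of analysis, the sketch below illustrates one plausible version of the pipeline behind the last two figures. It is not the code used in the study: the report-by-indicator matrix is randomly generated stand-in data, and the choice of Jaccard similarity and average-linkage clustering is an assumption, since the figure notes do not specify the similarity measure or linkage method used.

```python
# Illustrative sketch only: similarity, MDS, and hierarchical clustering
# of indicator co-occurrence. The input matrix is hypothetical stand-in
# data, not the 88-report by 52-indicator dataset from the review.
import numpy as np
from sklearn.manifold import MDS
from scipy.cluster.hierarchy import linkage
from scipy.spatial.distance import squareform

rng = np.random.default_rng(0)
X = rng.integers(0, 2, size=(88, 52))  # reports x indicators, presence/absence

# Jaccard similarity between indicators: co-mentions over joint mentions.
inter = X.T @ X                        # times indicators i and j co-occur
counts = X.sum(axis=0)                 # times each indicator occurs
union = counts[:, None] + counts[None, :] - inter
sim = inter / np.maximum(union, 1)
dist = 1.0 - sim                       # convert similarity to distance
np.fill_diagonal(dist, 0.0)

# Two-dimensional metric MDS of the precomputed distances; mds.stress_
# is the kind of fit statistic reported in the MDS figure note.
mds = MDS(n_components=2, dissimilarity="precomputed", random_state=0)
coords = mds.fit_transform(dist)

# Average-linkage clustering; Z can be passed to scipy's dendrogram().
Z = linkage(squareform(dist, checks=False), method="average")
```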

Andrews, Frank M. 1989. “The Evolution of a Movement.” Journal of Public Policy 9 (4): 401–405. doi:10.1017/S0143814X00008242.

Andrews, Frank M., and Stephen B. Withey. 2012. Social Indicators of Well-Being: Americans’ Perceptions of Life Quality. Berlin: Springer Science and Business Media.

Atkinson, Tony, Bea Cantillon, Eric Marlier, and Brian Nolan. 2002. Social Indicators: The EU and Social Inclusion. Oxford: Oxford University Press.

Austin, Diane, Brian Marks, Kelly Ames, Tom McGuire, Ben McMahan, Victoria Phaneuf, Preetam Prakash, Bethany Rogers, Carolyn Ware, and Justina Whalen. 2014. Offshore Oil and Deepwater Horizon: Social Effects on Gulf Coast Communities—Volume I: Methodology, Timeline, Context, and Communities. OCS Study BOEM 2014-617. New Orleans, LA: BOEM Gulf of Mexico OCS Region. http://www.data.boem.gov/PI/PDFImages/ESPIS/5/5384.pdf.

Badham, Marnie. 2009. “Cultural Indicators: Tools for Community Engagement?” International Journal of the Arts in Society 3 (5): 67–76.

Barnett, Jon, Simon Lambert, and Ian Fry. 2008. “The Hazards of Indicators: Insights from the Environmental Vulnerability Index.” Annals of the Association of American Geographers 98 (1): 102–119. doi:10.1080/00045600701734315.

Bauer, Raymond A. 1966. Social Indicators. Cambridge, MA: MIT Press.

Bergamini, Nadia, Robert Blasiak, Pablo Eyzaguirre, Kaoru Ichikawa, Dunja Mijatovic, Fumiko Nakao, and Suneetha M. Subramanian. 2013. Indicators of Resilience in Socio-ecological Production Landscapes (SEPLs). Yokohama: United Nations University Institute of Advanced Studies.

Berger-Schmitt, Regina, and Heinz-Herbert Noll. 2000. Conceptual Framework and Structure of a European System of Social Indicators. EuReporting Working Paper No. 9. Mannheim: Centre for Survey Research and Methodology (ZUMA). http://www.gesis.org/fileadmin/upload/dienstleistung/daten/soz_indikatoren/eusi/paper9.pdf.

Berkes, Fikret. 2006. “From Community-based Resource Management to Complex Systems: The Scale Issue and Marine Commons.” Ecology and Society 11 (1): art. 45. doi:10.5751/ES-01431-110145.

Bernard, H. Russell. 2006. Research Methods in Anthropology: Qualitative and Quantitative Approaches. 4th ed. Lanham, MD: AltaMira Press.

Biggs, Reinette, Maja Schlüter, Duan Biggs, Erin L. Bohensky, Shauna BurnSilver, Georgina Cundill, Vasilis Dakos, et al. 2012. “Toward Principles for Enhancing the Resilience of Ecosystem Services.” Annual Review of Environment and Resources 37: 421–448. doi:10.1146/annurev-environ-051211-123836.

Booth, Andrew, Diana Papaioannou, and Anthea Sutton. 2012. Systematic Approaches to a Successful Literature Review. Los Angeles: Sage.

Borgatti, Stephen P., and Inga Carboni. 2007. “On Measuring Individual Knowledge in Organizations.” Organizational Research Methods 10 (3): 449–462. doi:10.1177/1094428107300228.

Bowen, Robert E., and Cory Riley. 2003. “Socio-economic Indicators and Integrated Coastal Management.” Ocean and Coastal Management 46 (3–4): 299–312. doi:10.1016/S0964-5691(03)00008-5.

Bridges, Todd, Charley Chesnutt, Roselle Henn, Paul Wagner, Camley Walters, Ty Wamsley, and Kate White. 2013. Coastal Risk Reduction and Resilience. New York: US Army Corps of Engineers. http://www.swg.usace.army.mil/Portals/26/docs/PAO/Coastal.pdf.

Brown, Brett V., and Thomas Corbett. 1997. Social Indicators and Public Policy in the Age of Devolution. Special Report no. 71. Madison: Institute for Research on Poverty, University of Wisconsin–Madison. http://www.irp.wisc.edu/publications/sr/pdfs/sr71.pdf.

Carpenter, Ann. 2013. Social Ties, Space, and Resilience: Literature Review of Community Resilience to Disasters and Constituent Social and Built Environmental Factors. Community and Economic Development Discussion Paper No. 02-13. Atlanta: Federal Reserve Bank of Atlanta. https://www.frbatlanta.org/-/media/documents/community-development/publications/discussion-papers/2013/02-literature-review-of-community-resilience-to-disasters-2013-09-25.pdf.

Carpenter, Steve, Brian Walker, J. Marty Anderies, and Nick Abel. 2001. “From Metaphor to Measurement: Resilience of What to What?” Ecosystems 4: 765–781. doi:10.1007/s10021-001-0045-9.

Carriger, John F., Stephen J. Jordan, Janis C. Kurtz, and William H. Benson. 2015. “Identifying Evaluation Considerations for the Recovery and Restoration from the 2010 Gulf of Mexico Oil Spill: An Initial Appraisal of Stakeholder Concerns and Values.” Integrated Environmental Assessment and Management 11 (3): 502–513. doi:10.1002/ieam.1615.

Clarke, Beverley, Laura Stocker, Brian Coffey, Peat Leith, Nick Harvey, Claudia Baldwin, Tom Baxter, et al. 2013. “Enhancing the Knowledge-Governance Interface: Coasts, Climate and Collaboration.” Ocean and Coastal Management 86: 88–99. doi:10.1016/j.ocecoaman.2013.02.009.

Clay, Patricia M., Andrew Kitts, and Patricia Pinto da Silva. 2014. “Measuring the Social and Economic Performance of Catch Share Programs: Definition of Metrics and Application to the US Northeast Region Groundfish Fishery.” Marine Policy 44: 27–36. doi:10.1016/j.marpol.2013.08.009.

Clay, Patricia M., and Julia Olson. 2008. “Defining ‘Fishing Communities’: Vulnerability and the Magnuson-Stevens Fishery Conservation and Management Act.” Human Ecology Review 15 (2): 143–160.

Cobb, Clifford W., and Craig Rixford. 1998. Lessons Learned from the History of Social Indicators. Vol. 1. San Francisco: Redefining Progress.

Cohen, Wilbur J. 1969. Toward a Social Report. ERIC no. ED054039. Washington, DC: Department of Health, Education, and Welfare. http://eric.ed.gov/?id=ED054039.

Cropper, Maureen L., and Wallace E. Oates. 1992. “Environmental Economics: A Survey.” Journal of Economic Literature 30 (2): 675–740.

Cutter, Susan L., and Christopher T. Emrich. 2006. “Moral Hazard, Social Catastrophe: The Changing Face of Vulnerability along the Hurricane Coasts.” The ANNALS of the American Academy of Political and Social Science 604 (1): 102–112. doi:10.1177/0002716205285515.

Cutter, Susan L., Christopher T. Emrich, Jennifer J. Webb, and Daniel Morath. 2009. Social Vulnerability to Climate Variability Hazards: A Review of the Literature. Final Report to Oxfam America. Columbia: Hazards and Vulnerability Research Institute, University of South Carolina.

Dahl, Arthur Lyon. 2012. “Achievements and Gaps in Indicators for Sustainability.” Ecological Indicators 17: 14–19. doi:10.1016/j.ecolind.2011.04.032.

Davies, Philip. 2012. “The State of Evidence-based Policy Evaluation and Its Role in Policy Formation.” National Institute Economic Review 219 (1): R41–R52. doi:10.1177/002795011221900105.

Eakin, Hallie, and Amy Lynd Luers. 2006. “Assessing the Vulnerability of Social-Environmental Systems.” Annual Review of Environment and Resources 31: 365–394. doi:10.1146/annurev.energy.30.050504.144352.

Ferriss, Abbott L. 1979. “The U.S. Federal Effort in Developing Social Indicators.” Social Indicators Research 6 (2): 129–152.

Feurt, Christine. 2006. Cultural Models: A Tool for Enhancing Communication and Collaboration in Coastal Resources Management. NOAA Grant No. NA03NOS4190195. Wells, ME: Wells National Estuarine Research Reserve. http://www.wellsreserve.org/writable/files/microsoft_word_-_cultural_models_primer.pdf.

“FFO-2015.” 2016. NOAA RESTORE Act Science Program. https://restoreactscienceprogram.noaa.gov/funding/ffo-2015 (accessed 1 September 2016).

Force, Jo Ellen, and Gary E. Machlis. 1997. “The Human Ecosystem Part II: Social Indicators in Ecosystem Management.” Society and Natural Resources 10 (4): 369–382. doi:10.1080/08941929709381035.

Foucault, Michel. 2007. Security, Territory, Population: Lectures at the Collège de France, 1977–1978. Basingstoke: Palgrave Macmillan.

Gari, Sirak R., Alice Newton, and John D. Icely. 2015. “A Review of the Application and Evolution of the DPSIR Framework with an Emphasis on Coastal Social-Ecological Systems.” Ocean and Coastal Management 103: 63–77. doi:10.1016/j.ocecoaman.2014.11.013.

Gergen, Kenneth J. 1973. “Social Psychology as History.” Journal of Personality and Social Psychology 26 (2): 309–320.

Green, Maria. 2001. “What We Talk about When We Talk about Indicators: Current Approaches to Human Rights Measurement.” Human Rights Quarterly 23 (4): 1062–1097. doi:10.1353/hrq.2001.0054.

Hacking, Ian. 1990. The Taming of Chance. Cambridge: Cambridge University Press.

Hák, Tomás, Bedrich Moldan, and Arthur Lyon Dahl. 2012. Sustainability Indicators: A Scientific Assessment. Washington, DC: Island Press.

Holling, Crawford S. 1973. “Resilience and Stability of Ecological Systems.” Annual Review of Ecology and Systematics 4: 1–23.

Hou, Ying, Shudong Zhou, Benjamin Burkhard, and Felix Müller. 2014. “Socioeconomic Influences on Biodiversity, Ecosystem Services and Human Well-Being: A Quantitative Application of the DPSIR Model in Jiangsu, China.” Science of the Total Environment 490: 1012–1028. doi:10.1016/j.scitotenv.2014.05.071.

Hueting, Roefie, and L. Reijnders. 2004. “Broad Sustainability Contra Sustainability: The Proper Construction of Sustainability Indicators.” Ecological Economics 50 (3–4): 249–260. doi:10.1016/j.ecolecon.2004.03.031.

IFPRI (International Food Policy Research Institute). 2015. 2014–2015 Global Food Policy Report. Washington, DC: IFPRI. doi:10.2499/9780896295759.

Innes, Judith Eleanor. 1975. Social Indicators and Public Policy: Interactive Processes of Design and Application. Amsterdam: Elsevier.

Innes, Judith Eleanor. 1989. “Disappointments and Legacies of Social Indicators.” Journal of Public Policy 9 (4): 429–432. doi:10.1017/S0143814X00008291.

Innes, Judith Eleanor, and David E. Booher. 2000. “Indicators for Sustainable Communities: A Strategy Building on Complexity Theory and Distributed Intelligence.” Planning Theory and Practice 1 (2): 173–186. doi:10.1080/14649350020008378.

Innes, Judith Eleanor, and David E. Booher. 2010. Planning with Complexity: An Introduction to Collaborative Rationality for Public Policy. London: Routledge.

Jacob, Steve, and Michael Jepson. 2009. “Creating a Community Context for the Fishery Stock Sustainability Index.” Fisheries 34 (5): 228–231. doi:10.1577/1548-8446-34.5.228.

Johns, Grace, Donna J. Lee, Vernon Leeworthy, Joseph Boyer, and William Nuttle. 2014. “Developing Economic Indices to Assess the Human Dimensions of the South Florida Coastal Marine Ecosystem Services.” Ecological Indicators 44: 69–80. doi:10.1016/j.ecolind.2014.04.014.

Kristensen, Peter. 2004. “The DPSIR Framework.” Paper presented at a United Nations Environment Programme workshop, Nairobi, Kenya, 27–29 September. http://wwz.ifremer.fr/dce_eng/content/download/69291/913220/file/DPSIR.pdf.

Land, Kenneth C., Alex C. Michalos, and Joseph Sirgy, eds. 2011. Handbook of Social Indicators and Quality of Life Research. Berlin: Springer Science and Business Media.

Land, Kenneth C., and Seymour Spilerman, eds. 1975. Social Indicator Models. New York: Russell Sage Foundation.

Le Gentil, Eric, and Rémi Mongruel. 2015. “A Systematic Review of Socio-Economic Assessments in Support of Coastal Zone Management (1992–2011).” Journal of Environmental Management 149: 85–96. doi:10.1016/j.jenvman.2014.10.018.

MacDowall, Lachlan, Marnie Badham, Emma Blomkamp, and Kim Dunphy, eds. 2016. Making Culture Count: The Politics of Cultural Measurement. Berlin: Springer.

Maloney, John C. 1968. Review of Social Indicators, by Raymond A. Bauer. Journal of Business 41 (1): 115–118.

Martínez, Javier, and Emile Dopheide. 2014. “Indicators: From Counting to Communicating.” Journal for Education in the Built Environment 9 (1): 1–19. doi:10.11120/jebe.2014.00009.

McBain, Darian, and Ali Alsamawi. 2014. “Quantitative Accounting for Social Economic Indicators.” Natural Resources Forum 38 (3): 193–202. doi:10.1111/1477-8947.12044.

MEA (Millennium Ecosystem Assessment). 2005. Ecosystems and Human Well-Being: Synthesis. Washington, DC: Island Press. http://www.millenniumassessment.org/documents/document.356.aspx.pdf.

Noll, Heinz-Herbert. 2002. “Towards a European System of Social Indicators: Theoretical Framework and System Architecture.” Social Indicators Research 58 (1–3): 47–87.

Noll, Heinz-Herbert. 2004. “Social Indicators and Quality of Life Research: Background, Achievements and Current Trends.” In Advances in Sociological Knowledge: Over Half a Century, ed. Nikolai Genov, 151–181. Berlin: Springer.

Ostrom, Elinor. 2009. “A General Framework for Analyzing Sustainability of Social-Ecological Systems.” Science 325 (5939): 419–422. doi:10.1126/science.1172133.

Petterson, John S., Edward Glazier, Laura D. Stanley, Carson Mencken, Karl Eschbach, Patrick Moore, and Pamela Goode. 2008. Benefits and Burdens of OCS Activities on States, Labor Market Areas, Coastal Counties, and Selected Communities. OCS Study MMS 2008-052. New Orleans, LA: BOEM Gulf of Mexico OCS Region. https://www.boem.gov/ESPIS/4/4537.pdf.

Pollnac, Richard B., Susan Abbott-Jamieson, Courtney Smith, Marc L. Miller, Patricia M. Clay, and Bryan Oles. 2006. “Toward a Model for Fisheries Social Impact Assessment.” Marine Fisheries Review 68 (1–4): 1–18.

Sawicki, David S. 2002. “Improving Community Indicator Systems: Injecting More Social Science into the Folk Movement.” Planning Theory and Practice 3 (1): 13–32. doi:10.1080/14649350220117780.

Scott, James C. 1998. Seeing Like a State: How Certain Schemes to Improve the Human Condition Have Failed. New Haven, CT: Yale University Press.

Sheldon, Eleanor B., and Howard E. Freeman. 1970. “Notes on Social Indicators: Promises and Potential.” Policy Sciences 1 (1): 97–111. doi:10.1007/BF00145195.

Smit, Barry, and Johanna Wandel. 2006. “Adaptation, Adaptive Capacity and Vulnerability.” Global Environmental Change 16 (3): 282–292. doi:10.1016/j.gloenvcha.2006.03.008.

Tietenberg, Thomas. 2004. Environmental Economics and Policy. 4th ed. Boston: Pearson Addison-Wesley.

UN-DESA (United Nations Division for Sustainable Development). 1992. Agenda 21. UN Conference on Environment and Development, Rio de Janeiro, 3–14 June. https://sustainabledevelopment.un.org/content/documents/Agenda21.pdf.

UNEP (United Nations Environment Programme). 2015. An Introduction to Environmental Assessment. Cambridge: UNEP World Conservation Monitoring Centre. http://wedocs.unep.org//handle/20.500.11822/7557.

Walker, Brian, Lance Gunderson, Ann Kinzig, Carl Folke, Steve Carpenter, and Lisen Schultz. 2006. “A Handful of Heuristics and Some Propositions for Understanding Resilience in Social-Ecological Systems.” Ecology and Society 11 (1): 80–94.

Weller, Susan C. 2007. “Cultural Consensus Theory: Applications and Frequently Asked Questions.” Field Methods 19 (4): 339–368. doi:10.1177/1525822X07303502.

White, Howard D. 1983. “A Cocitation Map of the Social Indicators Movement.” Journal of the American Society for Information Science 34 (5): 307–312. doi:10.1002/asi.4630340502.

Wilson, Jeffrey, Peter Tyedmers, and Ronald Pelot. 2007. “Contrasting and Comparing Sustainable Development Indicator Metrics.” Ecological Indicators 7 (2): 299–314. doi:10.1016/j.ecolind.2006.02.009.

Wong, Cecilia. 2003. “Indicators at the Crossroads: Ideas, Methods and Applications.” Town Planning Review 74 (3): 253–279. doi:10.3828/tpr.74.3.1.

Wong, Cecilia. 2006. Indicators for Urban and Regional Planning: The Interplay of Policy and Methods. London: Routledge.

Yohe, Gary, Elizabeth Malone, Antoinette Brenkert, Michael Schlesinger, Henk Meij, Xiaoshi Xing, and Daniel Lee. 2006. A Synthetic Assessment of the Global Distribution of Vulnerability to Climate Change from the IPCC Perspective That Reflects Exposure and Adaptive Capacity. Palisades, NY: Center for International Earth Science Information Network, Columbia University. http://sedac.ciesin.columbia.edu/mva/ccv/sagdreport.pdf.

Yoskowitz, David, and Marc Russell. 2015. “Human Dimensions of Our Estuaries and Coasts.” Estuaries and Coasts 38 (S1): 1–8. doi:10.1007/s12237-014-9926-y.