1. Introduction

New strategies for managing the marine commons are needed in response to degraded marine ecosystems and the fish and human populations they support (Berkes 2006; Berkes et al. 2006; Wilson 2007). Recognizing environmental uncertainty and ecosystem complexity, many scholars advocate more adaptive or ecosystem-based approaches (Hilborn 2004; Pikitch et al. 2004; Sissenwine and Murawski 2004; Castilla and Defeo 2005). Such adaptive governance approaches require good, reliable sources of information about stocks, flows, and processes at scales beyond those of large-scale traditional data collection (Dietz et al. 2003; Wilson 2006; Wilson 2007). Fishers’ local knowledge is considered to be one such information source (Berkes et al. 2000; Drew 2005). Although many scholars have demonstrated the value of fishers’ knowledge and advocated its use, fishers’ knowledge has generally played a limited role in science and management (Neis et al. 1999; Pálsson 2000; but see Haggan et al. 2007). Wilson (2007) cautions that the lack of connection between local, qualitative fishers’ knowledge and experiences and large-scale, quantitative scientific-based management approaches hinders the learning and adaptation necessary for effective governance of the marine commons. As in other science policy domains, communication between scientists and non-scientists is often hindered by pragmatic, institutional, and cultural factors even though non-scientific citizens are often recognized as having valuable knowledge to contribute (Irwin 1995; Fischer 2000; Wilson and Degnbol 2002; Backstrand 2003; Wilson 2003; Carr 2004), including local or traditional ecological knowledge (Berkes et al. 2000; Murray et al. 2006) and experience-based expertise (Collins and Evans 2007).

The lack of communication and knowledge exchange between fishers and scientists, as well as managers, has often been at the root of fisheries management conflicts (Finlayson 1994; Smith 1995; Dobbs 2000; Hanna et al. 2000; Weber 2002). This is particularly true for the large-scale commercial groundfish fisheries in the Northeast U.S. As participants in a co-management system (Pomeroy and Berkes 1997), scientists, fishers, and managers collectively failed to devise rules to effectively manage this common-pool resource. With few opportunities to contribute their knowledge, which was often dismissed as irrelevant or anecdotal, fishers grew distrustful of scientists’ stock assessments and resulting management decisions (Hartley and Robertson 2006). In particular, fishers were concerned about the credibility of the government’s resource survey that produced assessments and management advice that conflicted with their experience and observations (Dobbs 2000; Weber 2002). Scientists were unable to convince managers and the industry of the need to reduce fishing effort and, consequently, effective rules to reduce fishing mortality were not implemented (Dobbs 2000; Weber 2002). Scientists similarly grew distrustful due to the industry’s dismissal of their scientific expertise, increasing the barrier to communication and knowledge exchange (Hanna et al. 2000). This “great divide” between fishers and scientists, rooted in differential experiences and observations, remains a critical impediment to successful collective action and marine conservation in this region.

Ostrom (2005) and others suggest that mutual understanding of resource conditions can increase the likelihood that stakeholders will organize and agree upon common rules for resource management. One approach to linking the local knowledge and experiences of fishers with the large-scale efforts of science-based management (and hopefully creating mutual understanding about the condition of the resources) is to include fishers in scientific research in what is known as cooperative (or collaborative) research (e.g. NRC 2004; Reid and Hartley 2006; Johnson and van Densen 2007). Cooperative research refers simply to fishers and scientists engaged together in scientific research, with partnerships ranging from chartering vessels to the full industry integration in all stages of research (Bernstein and Iudicello 2000; Karp et al. 2001; NRC 2004; Conway and Pomeroy 2006; McCay et al. 2006; Johnson and van Densen 2007). By recognizing the expertise of different stakeholders (scientists and non-scientists), increasing transparency of the research process, and improving relationships and trust among fishers, scientists, and managers through participation, cooperative research is expected to improve the effectiveness of marine commons management (Kaplan and McCay 2004; Johnson and van Densen 2007).

This paper investigates the interactions between fishers and scientists in the case of cooperative research in the Northeast U.S. Cooperative research emerged in this region beginning in the late 1990s in response to the regional fisheries crisis, including stock depletion and socioeconomic hardship, as well as an adversarial political climate (Dobbs 2000; Hartley and Robertson 2006). The U.S. Congress allocated millions of dollars to fund industry-science cooperative research with the intent of improving industry-science relations, improving the knowledge base of fisheries management, and providing supplemental income to fishers impacted by the fisheries crisis. While these funds focused on the New England groundfish fisheries, similar efforts emerged in the Mid-Atlantic surfclam (Spisula solidissima) and ocean quahog (Arctica islandica) fishery, the Illex and Loligo squid fisheries, and the sea scallop (Placopecten magellanicus) fishery. In the Northeast region, a form of cooperative research emerged that goes beyond past efforts of simply chartering fishers’ vessels as research platforms and instead involves fishers in meaningful ways throughout the scientific research process (McCay et al. 2006; Johnson and van Densen 2007).

Specifically, this paper examines closely the interactions between scientists and fishers (and their knowledge) in cooperative fisheries research, conceptualized in terms of flow of knowledge or exchange of information across a socially constructed boundary between science and fishers’ knowledge (Figure 1). The concept of the boundary is complex; I have borrowed and adapted the concept from the field of science studies. In that literature, the boundary between science and non-science is considered the result of “boundary-work”, a discursive process where select qualities are assigned to scientists, scientific methods, and scientific claims (Jasanoff 1987; Gieryn 1999; Guston 2001). As an example, a boundary is typically constructed between fishers’ and scientists’ knowledge when the former is considered “lay” or “anecdotal”. Additionally, in this case, a boundary is created when fishers and scientists are not able to exchange information and knowledge due to distrust and misunderstanding. Evidence for this boundary can be found in a large body of scholarship on the interaction between fishers and scientists (and their knowledge) (e.g. Finlayson 1994; Weeks 1995; Dobbs 2000; Neis and Felt 2001; Wilson and Degnbol 2002; Haggan et al. 2007). Although this paper focuses on the boundary between fishers and scientists, an institutionalized boundary between fishers’ knowledge and policy also exists and is evidenced in legal mandates that require policy-making to be based only on the “best scientific information available” (NRC 2004). Most critically, scientific knowledge, rather than fishers’ knowledge, determines the appropriate levels of harvest to achieve legally defined conservation goals. Fishers’ knowledge, of course, does influence policy; it determines the constellation of management rules implemented to achieve the scientific-based conservation goals, particularly rules related to resource allocation. 
Regardless, fishers feel that their knowledge is not included in policy-making, as illustrated in Figure 1 by the boundary between fishers’ knowledge and policy.

As I discuss flow across boundaries in the context of cooperative research, I refer to three processes: boundary spanning, boundary maintenance, and boundary management. Boundary spanning involves flow across the boundary, such as the exchange of ideas or knowledge, or non-scientists performing roles that scientists typically fill. Boundary maintenance includes processes that reaffirm or maintain the boundary, such as those that limit or discourage communication or in which only scientists determine what does or does not count as valid information for policy-making. Finally, boundary management is a more complex process that involves flow across the boundary while still maintaining a distinction between scientists and fishers (and their knowledge): a combination of maintenance and spanning. The concept of boundary management is borrowed from Guston (2001) and others who describe boundary institutions as managing the boundary between science and policy. I find that cooperative research allows knowledge to flow from fishers to science, often through the translation of fishers’ knowledge into scientific knowledge as it is tested and verified. Knowledge and expertise also flow from scientists to fishers, as fishers gain understanding of scientific research and concepts. I suggest that this two-way knowledge flow between fishers and scientists can improve commons management through increased communication, trust, and capacity building, fostering the mutual understanding necessary for collective action. Finally, boundary spanners are identified as being critical to successful knowledge exchange in cooperative research.

2. Methods

This paper draws on extensive ethnographic research conducted in the Northeast U.S. from June 2003 to November 2006. This research was conducted as part of a larger study of the role of fishermen’s experience-based knowledge in the science policy process. Part of that study focused on cooperative research in the large-scale groundfish and squid fisheries in New England and the Mid-Atlantic. This study paid close attention to the interactions of fishermen and scientists (and their knowledge) and identified cooperative research as a significant site of exchange. Research consisted primarily of semi-structured and informal interviews, discussions with key stakeholders and other participants, direct observations of the science policy process, and a review of relevant documents. A total of forty-five formal semi-structured interviews were conducted with members of the fishing industry and scientists (government and non-government). This total includes nine present or former fishery management council members and four scientists who administered cooperative research programs. These interviews were audio recorded and transcribed. Additionally, countless informal interviews and discussions occurred with fishermen, scientists, and managers throughout the research. These interviews and discussions were not audio recorded but were documented in extensive field notes. Sixty multiple-day meetings were also observed, including those of the regional fishery management councils, federal scientific stock assessment working groups and review panels, fishing industry forums, and cooperative research planning committees. In addition, a review of key documents, including final research reports and media articles (particularly those from Commercial Fisheries News) related to cooperative research in this region, also informed this research. These provided additional insights into the various roles that different participants performed in cooperative research.
Verbatim transcripts from the New England Fishery Management Council and the Mid-Atlantic Fishery Management Council were also reviewed. Semi-structured interview transcripts, field notes from meetings and informal interviews and discussions, and excerpts of relevant fishery management council meetings were entered into a QSR-N6 database for storage and qualitative analysis. Qualitative analysis proceeded through careful coding of the data. In this approach, understanding (or the discovery of patterns) emerged inductively through close reading of texts (i.e. interviews, transcripts, and field notes). The results and discussion presented below are one outcome of this analysis.

3. Results: flow between fishers and scientists

3.1. Fishers to scientists: communication and translation

On one level, cooperative research is about communicating fishers’ knowledge and observations to scientists. Even more, it is about the translation of fishers’ knowledge into scientific knowledge for use in policy-making. In the Northeast U.S. there are many examples of this interaction as fishers are now normally involved “from start to finish” in cooperative research, as shown in Figure 2. In this way, cooperative research goes beyond past collaborations where fishers’ involvement in scientific research involved merely having them provide chartered research platforms for scientists.

Fishers’ knowledge flows to scientists when fishers contribute their knowledge throughout the research process (Figure 2). A critical site of flow from fishers to scientists is the involvement of fishers in the generation of research questions or hypotheses. Many cooperative research projects in the Northeast U.S. are based on hypotheses or questions that fishers identified from their experience or knowledge, including their experience with marine ecosystems and institutional environments (i.e. the management arena). For example, fishers first hypothesized that using the Nordmore grate, a gear selectivity device used in the shrimp (Pandalus borealis) fishery, would allow them to catch whiting (Merluccius bilinearis) without catching depleted stocks of groundfish.

Fishers also contribute to the design of research through the selection of gear used in data collection. This is most obvious in gear selectivity research (Glass 2000), but is also critical in the design of resource assessment surveys that are conducted on industry vessels, known as industry-based surveys. The Trawl Survey Advisory Panel, an industry-science advisory panel that collaborated to design a new gear package for the government survey conducted aboard the new research vessel, the R/V Henry Bigelow, is an excellent example of utilizing fishers’ expertise in the design of the gear used as the basis of research (Johnson 2007). In several industry-based surveys, fishers also identified the location of the fixed stations that were part of the research sampling program.

Fishers are active contributors to data collection. For example, fishers were instrumental in the tagging of more than 100,000 Atlantic cod in the Northeast Regional Cod Tagging program, and fishers remain critical for collecting the data through the reporting of recaptures (Tallack 2006). Another example of fishers collecting data is the Illex Squid Real-time Data Collection effort where squid fishers have voluntarily reported their catch and effort data electronically in real-time (Hendrickson et al. 2003).
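To illustrate why reporting recaptures matters so much, the basic logic of a tag-recapture study can be sketched with the classic Lincoln-Petersen estimator. This is a deliberate simplification under textbook assumptions (closed population, equal catchability); actual tagging programs typically rely on more sophisticated models, and the numbers below are hypothetical rather than drawn from the cod tagging program:

```python
def petersen_estimate(marked, catch, recaptures):
    """Lincoln-Petersen abundance estimate.

    If `marked` fish were tagged and released, and a later sample of
    `catch` fish contains `recaptures` tagged fish, the population size
    is estimated as marked * catch / recaptures: the proportion of
    tagged fish in the sample is assumed to mirror the population.
    """
    if recaptures == 0:
        raise ValueError("need at least one reported recapture")
    return marked * catch / recaptures

# Hypothetical numbers: 100,000 tagged fish, a sample of 5,000 fish
# of which 250 carry tags
print(petersen_estimate(100_000, 5_000, 250))  # 2000000.0
```

The estimator makes the fishers' role concrete: without reported recaptures the denominator is empty and no estimate is possible, which is why ongoing tag returns from fishers are as critical as the initial tagging effort.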

In some cases, fishers also provide insight into how to interpret the data or results, although their role in this aspect of research is typically minimal. A prominent example is the Trawl Survey Advisory Panel, where fishers helped interpret the results of sea-trials of the proposed research gear and recommended modifications for further testing (Johnson 2007). In many cases, typically when the venues are more stakeholder-based and/or management oriented, fishers present findings from cooperative research. For example, industry participants in cooperative research testified alongside scientists at regional fishery management council meetings in order to get the Maine grated whiting fishery approved as an exempted fishery (Plante 2003). Fishers also commonly present cooperative research results at local industry meetings, such as the annual Maine Fisherman’s Forum.

As fishers contribute to these important stages of research, fishers’ knowledge is spanning the boundary: it is being shared with scientists and integrated into science. In these examples, the kind of fishers’ knowledge most often shared is “technical” knowledge about gear design and deployment and vessel operations. Fishers also provide “ecological” knowledge related to how different gear interacts with different environments (e.g. bottom types, currents, depths). Other “ecological” knowledge includes where and when to catch fish; i.e. the spatial and temporal distribution and abundance of fish and fish movement/behavior patterns.

There are many examples where scientists gained knowledge from fishers during cooperative research that they would not otherwise be able to access. One fisher explained how he and his crew shared their knowledge with scientists:

“They hired my boat to sample the deepest spots in the Gulf of Maine. We went out and did the work…When we were all done, I said to the guy, ‘Don’t you want to go to the deepest spot in the Gulf of Maine?’ They had said they already plotted it out, and I said, ‘No you didn’t.’ [We told them that there] was a place that we knew wasn’t on the chart. He just couldn’t get over the fact that these guys are out here and these guys know what is going on.”

Another account of scientists learning from fishers came from a scientist involved in a large-scale fish tagging effort. When asked if he had learned anything, he responded:

“Some more subtle things; I mean, you can, having done two years with a rod and reel you can tell when it’s a cod and you can tell when it’s a dogfish. It just makes you aware…And this is one of the things that has [sic] interested me about this. There are subtleties to a perception of the natural environment that run a lot deeper than you think…Dozens of complexities involved, such that you couldn’t model it, many of these situations. Yet fishers are aware of them; remarkable level of processing. And you realize that that comes from just thousands of repetitions, essentially individual data points.”

Several informants suggested that one of the most important outcomes of cooperative research is that scientists learn the complexities that fishers deal with, which characterize their knowledge. In one fisher’s words:

“I think scientists are getting to know an appreciation of what we really go through. I think scientists are feeling the appreciation that everything is not black and white. There is not necessarily a textbook definition to what happens and what doesn’t happen. I think the understanding that there are a lot more variables in fishing than science can account for.”

Through cooperative research, fishers’ knowledge is made scientific through a process that Holm (2003) refers to as translation or purification. That is, fishers’ knowledge is verified, aggregated, or otherwise made “scientific” for use in policy-making. Cooperative research results, unlike fishers’ knowledge, are scientific, and as such are able to legally count as the “best scientific information available” for use in policy-making. Critics have noted that the translation process may change the nature of fishers’ knowledge (Agrawal 1995) and continues to privilege scientific knowledge over fishers’ knowledge (Holm 2003). Although this may be true in cooperative research, it is not trivial that fishers are now participants in the translation of their knowledge into science. In my research, I have seen fishers often exhibit a high level of satisfaction from proving their knowledge through science. One fisher interviewed defined cooperative research as “validating of fishers’ knowledge” that otherwise is dismissed, and thus views the process as giving their knowledge power. Like the scientific knowledge that scientists produce alone, cooperative research becomes credible for use in policy-making through peer review. In my research, I found that fishers appear to respect the peer review process as a valid mechanism for ensuring legitimate and credible knowledge for policy. For example, they were instrumental in generating independent scientific reviews of government science in the form of National Research Council reviews (NRC 1998, 2000) and other independent peer reviews (e.g. Payne 2003). In my observations, I have heard fishers calling for peer reviews of cooperative research results as a way to expedite their utilization in management. 
In this way, fishers also maintain the boundary between science and fishers’ knowledge, but with the intent of ensuring that the results of cooperative research (which contain their knowledge) are categorized on the other side, the science side, so that they are used in science and management. As noted, only knowledge on the science side can count as the “best scientific information available” for use in policy-making.

3.2. Science to fishers: learning and capacity building

Fishers often emphasize that by working with scientists they gain interesting insights about the resources they harvest. This, they said, came from working together with scientists, observing them and listening to them, an opportunity they rarely had in the past. The following two quotes by two different fishers illustrate examples of this learning:

“I’ve been shrimping for 20 years, and I didn’t know the names of some of the different species of shrimp. He laid them out on the table and showed us what they were. He showed us some of the shrimp had a shell disease. We noticed it but we didn’t know what it was. I learned a ton!”

“I’m learning, there’s no question…. I’m getting a more complete understanding of what’s going on. Not just a one side understanding, or a single person’s perspective…I’m learning more about biology, fish biology, I’m learning more about the behaviors….”

Interviewed scientists cite the fact that fishers have gained understanding of some of the fundamental aspects of experimental design as one of the most important outcomes of cooperative research. As one example, in the past many fishers did not understand the scientific concept of “stratified random sampling” used for resource surveys, often questioning why scientists towed in places where fishers already knew that fish were unlikely to be found (Johnson and van Densen 2007). They also questioned the “standardization” of the survey (i.e. that vessel and gear configurations have remained consistent since the 1960s), which many felt made it very inefficient at catching certain species (Johnson 2007). As an example, one fisher complained about the scientists: “They fish in places where there isn’t fish. They tow too fast and they don’t have ground gear…They’re lucky to get any fish” (Stevens 2001). On the other hand, now many fishers understand why stations must be random and surveys standardized. This was apparent from conversations with fishers involved in the cooperative industry-based surveys, such as the Maine-New Hampshire Inshore Trawl Survey (Sowles et al. 2002), the Cod Industry-based survey (Hoffman et al. 2006), and the Mid-Atlantic Supplemental Finfish Survey (King and Powell 2007). Fishers on the Trawl Survey Advisory Panel deliberately balanced the need for more technologically efficient gear with the recognition that the survey must be consistent (standardized) and that catching too many fish was antithetical to the goal (Johnson 2007).
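The stratified random sampling design that fishers came to understand can be sketched in a few lines: stations are allocated to depth or area strata in proportion to stratum size, and the abundance index is an area-weighted mean of catch per tow, not a tally of the biggest catches. The stratum names, areas, and catch values below are hypothetical, and real survey designs involve many further details (allocation rules, variance estimation):

```python
def allocate_stations(strata_areas, total_stations):
    """Allocate survey stations to strata in proportion to stratum area,
    a common design for fishery-independent trawl surveys."""
    total_area = sum(strata_areas.values())
    return {name: max(1, round(total_stations * area / total_area))
            for name, area in strata_areas.items()}

def stratified_mean(strata_areas, catches):
    """Area-weighted mean catch per tow across strata, the quantity
    underlying a stratified survey abundance index."""
    total_area = sum(strata_areas.values())
    return sum((area / total_area) * (sum(catches[name]) / len(catches[name]))
               for name, area in strata_areas.items())

# Hypothetical strata (relative areas) and catch-per-tow observations
areas = {"inshore": 2000.0, "midshelf": 5000.0, "offshore": 3000.0}
print(allocate_stations(areas, 30))
catches = {"inshore": [12, 8, 10], "midshelf": [3, 5, 4], "offshore": [0, 1, 2]}
print(round(stratified_mean(areas, catches), 2))
```

The sketch makes visible the point fishers grasped: low-catch strata still receive stations because the index must represent the whole area, and deliberately fishing where catches are highest would bias the weighted mean upward.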

A prominent government stock assessment scientist emphasized this outcome of cooperative research:

“A fisherman will say, ‘What…do you need to look at that for? Well, you know, what difference does that make?’ We get those kinds of questions. For instance, with our trawl survey, ‘Why…do you keep using that old rag of a net and why…do you keep using that piece of junk ship?’ Well, there’s a reason for it and then when they go out and try and do some experimental work or some cooperative research or do their own survey, they’re like, ‘Oh yeah, now I know why you don’t just go out and catch as many fish as you can to get an index of abundance.’ Or. ‘Now I know why you don’t change boats.’ Or ‘Now I know why you don’t use a millionaire net to catch everything possible that you can catch.’ ”

Another government scientist reiterated the same view:

“We now have fishers talking to our people saying, ‘You guys have to pay attention cause we know what you’re trying to do and we’re a little bit concerned because on this survey; they are different people all the time guys…’ When you have a fisherman who tells you, ‘I’m a fisher and I don’t know statistics but I’ve had different groups out the last three or four times and I’m not sure they’re all subsampling the fish.’ When I’m at that point that’s a huge, huge improvement that’s been made in these three years. I mean really it’s remarkable. When we’ve got fishers using electronic logbooks and now coming back to us and saying, ‘Okay, but can you add this…. Can you give me better depth? Can you give me better temperature? And how quick can I see this stuff?’ Then we’ve gone from being you give us the information we’ll go off here, we’ll do our work, we’ll tell you what the answer is’ to “What do you think about this?” And that is I think…dramatic….”

In the statement above, the assessment scientist not only acknowledges that he has observed fishers’ learning about science, but also reiterates how this facilitates future knowledge exchange. The scientist believes that fishers, once they understand the utility of the data that the cooperative research effort is producing and how it is used, are encouraged to share more information with scientists.

Knowledge and skills regarding data collection are another example of how fishers have learned. For example, in the Northeast Regional Cod Tagging program fishers learned about the scientific research process. They not only learned how to tag and release fish but also how to measure fish, take samples, and record data. They learned to appreciate the scientific need for standardization as well as the importance of reporting tag returns (data). Similar learning has taken place with the fishers involved in the industry-based surveys and gear selectivity studies.

Fishers are also increasingly seen participating in discussions regarding stock assessment. For example, at the 2005 Illex squid (Illex illecebrosus) stock assessment working group and the independent peer review (NEFSC 2006), fishers participated in the discussion and shared their insights with scientists in order to improve the assessment. The fact that fishers understand some of these scientific concepts illustrates the flow of expertise across the boundary from scientists to fishers (i.e. capacity building). Fishers have gained a better understanding of how data are used in the assessments, even if they would not be expected to know the underlying complex mathematical equations and statistics. In this way, they have gained what Collins and Evans (2007) refer to as “interactional expertise” (Johnson forthcoming).

In addition to what fishers have learned through participation in the research process, many fishers have benefited from outreach and educational programs designed to facilitate engagement in cooperative research. One example of this is the Marine Research Education Project (Daigle 2003) where fishers and fisheries management professionals are immersed “in workshops focused on how scientific information is gathered and interpreted and how the regulatory and management process works” (Daigle 2003). From this, fishers gain a greater understanding of scientific methods through interactions with scientists. According to one fisher, a critical benefit of the program was a better understanding of the role of scientists:

“The scientists themselves understand there’s a lot they don’t know, but the way the regulations are written doesn’t give them any leeway…They gave us more information on exactly how they interpret the information they get when judging the size of fish stocks,” said Page. “I don’t necessarily agree with how they go about it, but it gave me a better understanding of their perspective” (Daigle 2003).

A government assessment scientist I interviewed further emphasized the importance of learning:

“It’s been hard to get there. But I’m now starting to see and I think a lot of people are starting to see the fishers, their interest, but their really detailed level of, ‘Well let me look at the data. Are you sure that you have to do tow by tow comparisons? How about us trying this?’ And that really gets you to a point where you’ve got multidisciplinary teams working to solve a problem. You know, and that’s really…I think that’s what we need because we’re facing some big, big challenges trying to recover some of these stocks of fish.”

In response to a question about whether learning in cooperative research was important, one fisher and former fishery management council member responded:

“Extremely important. Fishers need to understand the science behind management and assessments. Too many of them don’t. Too many of them don’t want to. But they need to have some kind of understanding of it so that they can understand and have the confidence in the assessments. It’s easy to say…to throw stones…because you don’t know what the science assessment has been going through, what they have been analyzing, how much information they have been collecting, how they have been collecting it, and how it works as pure science.”

Through their involvement in cooperative research, fishers and scientists have each learned how to cooperate with each other. One fisher explains that this capacity building was difficult:

“But the initial step, the initial process…the beginning of the initial process is a little cumbersome until you are familiar with the other side. You are just uncomfortable dealing with it. And there was…a good reluctance on both people’s part to accept the other sides’ concerns and views, and so forth. And that has gotten a lot better. And the reason it has got better is because people recognize on both sides that it is beneficial to them to have this cooperative working relationship. Both ways, both ways.”

Reflecting on the importance of cooperative research and learning, a prominent federal government scientist explained how communication and learning can reduce conflict:

“They probably still don’t agree with it [the survey]. They might not understand the whole technical aspects…That’s fine. We can have those kinds of disagreements, but as long as they’re disagreeing on facts. We both have the same facts so I don’t mind when they disagree on the basis of that, but when they’re screaming at us and they are really ignorant of what we are doing out there, then that causes me problems. Once you get beyond ignorance, you learn what each other do, what our motives are and you can respect and understand them better.”

3.3. Boundary maintenance: checks to the flow between science and industry

Efforts to ensure that the process and outcome remain scientific – with the view that those who determine what this means should be professional scientists – are prime examples of boundary maintenance, which can impede or narrow flow between fishers and scientists. For example, most research projects that are funded through these various cooperative research programs first undergo a proposal review process to determine if the research is truly scientific, as well as whether the project is likely to produce results deemed valuable. Typically, scientists within and/or outside of the federal government science agency review project proposals for technical merit. This functions as a way to maintain the boundary between what does and what does not count as science. Importantly, those who determine this are scientific peers, not fishers. A related example is that final reports (and/or data) undergo a scientific review process to ensure the scientific credibility of the results produced. Typically, at least one federal government scientist will review the final report before it is accepted, and in many cases at least one other “outside” scientist reviews it as well. Again, this peer review process is conducted by professional scientists rather than non-scientific experts. Related to this, of course, are legal requirements, such as the Data Quality Act of 2000 and National Standard 2, which allow only the best available scientific information for decision-making and thus make scientific peer review necessary.

Another often-cited example of boundary maintenance is that fishers most often do not have access to raw data, much to the disappointment of most fishers involved. The industry’s desire to have equal access to the data is rooted in its long history of distrust of scientists. In some cases “data ownership” has been a sticking point for cooperative research. Fishers feel they must be included in this phase of research, and when excluded they question the legitimacy of cooperative research. One Maine fisher explained how important sharing the data is to the industry. He had planned to participate in a cooperative research effort, but when told that he would not have access to the raw data, he backed out. In his words:

“We wanted to be able to get the data, and in the scoping process, we were absolutely told that ‘No, we couldn’t get the data.’ Again that’s my definition [of the] difference between ‘collaborative’ and ‘cooperative.’ I was going to cooperate on a program that I didn’t have any say in the design? The government was telling me what I was going to do…It’s backwards! I think that they are missing a huge opportunity…the whole idea of the people sharing the data was what was supposed to be broken down.”

Scientists argue that sharing the raw data is problematic from a data quality control perspective. They would like to review the raw data for human error and outliers to ensure that data considered for management are of the highest quality. In particular, federal scientists view this as their responsibility under the Data Quality Act of 2000. Nevertheless, the reluctance to share raw data implies that fishers are not viewed as having the appropriate expertise to judge whether the data from research are credible, at least from a scientific standpoint. The outcome of not sharing data is that some fishers do not participate in research, which creates a barrier to flow and maintains the boundary between fishers and scientists. At its core, this reflects the importance of trust. When fishers do not trust scientists, such as when they are denied involvement or access to the data, flow between fishers and scientists does not occur.

3.4. Boundary management: balanced flow

There are several important instances where a balanced flow is seen between fishers and scientists, where the boundary is managed in ways that allow fishers to participate in the production of knowledge while ensuring that the information provided meets the legal data quality standards for use in policy-making. For example, in some cases non-scientists participate in the review of proposals and/or the final reports. In the cooperative programs examined in this research, the proposals are reviewed in a competitive process. Funds are limited, so not every proposed project will be funded. Most funding programs operate with a review panel or committee that reads, ranks, and recommends which projects the program will fund. These panels or committees typically include industry participation. The programs have a list of guidelines or criteria used to judge the proposals. A conflict-of-interest requirement prohibits anyone involved in any of the proposals under review from participating in the proposal review. An example is the Northeast Consortium’s Advisory Committee, which reviews proposals and offers advice regarding which projects are funded, a process that includes fishers (Hartley and Robertson 2006). Involving fishers in the proposal review process means that they contribute, even if indirectly, to the selection of which projects are funded.

Similarly, in New England stakeholders are also involved in the review of final reports, through the New England Fishery Management Council’s Research Steering Committee (RSC). This committee, comprising council members, industry experts, and scientists, not only sets the research priorities listed in Requests for Proposals but also reviews both the submitted research proposals and the final reports. The research proposals receive a technical review, after which the RSC reviews and ranks them for funding, with the final decision resting with the government agency. Final reports are forwarded to the National Marine Fisheries Service (NMFS) for technical review, and the RSC then reviews them to (a) determine that they have had appropriate technical review and (b) provide any additional “value added” comments to direct the results for use in management. The RSC then forwards its findings as appropriate, such as to Plan Development Teams (PDT) or species committees, and a summary is reported to the full fishery management council. The process aims to be transparent and independent, while allowing multiple stakeholders to participate. Whereas traditional peer review typically involves only certified scientists, this process also includes relevant non-science stakeholders who bring their expertise, or “extended facts,” to the process. In many ways, this functions as an “extended peer review” process (Funtowicz and Ravetz 1993; Ravetz 2004). Extended peer review processes are expected to improve the legitimacy, credibility, and relevance of science, especially in the context of post-normal science, where facts are uncertain, values in dispute, stakes high, and decisions urgent (Funtowicz and Ravetz 1993; Ravetz 2004).

4. Discussion and conclusion

We see that cooperative research allows for a two-way flow of knowledge between fishers and scientists (i.e. boundary spanning). On the one hand, fishers’ knowledge flows to scientists as fishers participate throughout the research process (Figure 2). Fishers’ knowledge tends to be produced at a more local scale than fisheries scientists’ research-based knowledge. Cooperative research makes fishers’ local knowledge more relevant or usable in the science policy process, either by making it fit the requirements of the scientific method or by aggregating it to a scale more compatible with science-based management. In this way, scientific knowledge remains privileged over fishers’ knowledge, which must still be refined and translated before entering the science policy process. This translation is necessary because being “scientific” engenders credibility and legitimacy, while also meeting legal data quality requirements.

Cooperative research is a better solution than relying solely on either scientists or fishers to produce knowledge for policy. On one hand, overreliance on scientific knowledge may ignore crucial local processes and phenomena, and may not engender buy-in or compliance with management. The Canadian cod crisis illustrates a potential outcome of dismissing fishers’ knowledge in favor of scientific knowledge (Finlayson 1994). On the other hand, overreliance on fishers’ knowledge is also a concern. As discussed earlier, one interpretation of the New England groundfish crisis is that it is an outcome of dismissing scientific knowledge (Boreman et al. 1997). Linking fishers’ local, qualitative knowledge with scientists’ large-scale, quantitative knowledge is what Wilson (2007) suggests is necessary for managing multi-scale, complex marine ecosystems. Indeed, cooperative research produces information that, although still a form of scientific knowledge, utilizes fishers’ local ecological knowledge while drawing on the benefits of science as a source of consensus. The key is that cooperative research maintains those standards accepted by all parties as producing legitimate and credible knowledge, such as through both conventional and extended peer review processes. This balance between fishers’ and scientists’ knowledge and expertise is what enables cooperative research to manage the boundary effectively.

Knowledge or expertise also flows from scientists to fishers, which we consider part of the capacity building that occurs in cooperative research. By engaging with scientists in cooperative research, fishers learn about science, including scientific data collection procedures and how data are used in assessments and management. They also learn the meaning of scientific concepts such as standardization, stratification, and sub-sampling. This interactional expertise (Collins and Evans 2007) related to the scientific research allows fishers to communicate and translate their knowledge to scientists. This capacity building, or sharing of expertise from scientists to fishers, was evident when several fishers involved in the Illex squid real-time data collection effort were able to communicate effectively in the stock assessment process, in part because of the learning that occurred in the cooperative research effort (Johnson forthcoming). The key is that the interactional expertise gained through cooperative research enables fishers to communicate their knowledge and expertise to scientists.

Improvement in communication across the boundary between scientists and fishers is an important outcome of cooperative fisheries research (Johnson and van Densen 2007). In addition to the ability to communicate due to fishers gaining interactional expertise, communication also emerges as trust builds between fishers and scientists. For example, one fisher explained how cooperative research has improved communication between fishers and scientists:

“Years ago we’d walk into a room and you’d go to a fishery management…meeting, scientists would sit on one side and fishers would sit on the other side and nobody would talk to each other. It was like nobody wanted to give up trade secrets. Now people are starting to have lunch together, talk to each other. The scientists are talking to fishers to get the fishers’ view. The fishers are getting scientist’s view. So I think we’re on the right road; it’s getting better.”

Critical to exchange in many cooperative research efforts are the “boundary spanners” who are able to communicate on both sides of the boundary (Johnson 2007, forthcoming). In some cases, they are scientists with “abstract/generalizable” contributory expertise (Carolan 2006) who recognize the value of fishers and their knowledge and are able to communicate with non-scientists; that is, they have interactional expertise. They are also able to communicate the industry’s concerns to other scientists, due in part to language proficiency: they know the scientific terminology that is often unfamiliar to fishers. In this way, these scientists perform critical boundary spanning functions. Fishers can also be boundary spanners, such as key fishers who promote cooperative research to other fishers. These fishers have “local/practical” contributory expertise (Carolan 2006) about the fisheries and environments with which they interact, as well as interactional expertise related to science and management that allows them to communicate with scientists. Some are members of the regional fishery management council or leaders in cooperative research, and as such they are well informed about the complex issues of management and science. Given their critical roles in facilitating boundary spanning and boundary management, greater understanding of the social characteristics of these individuals is needed.

Cooperative research can also improve the management process. Poor industry-science relationships and distrust have emerged from the lack of communication and differential understandings of resource conditions. This has led many fishers to suspect that scientists and government officials were conspiring against them, even going so far as to suggest that scientists were making up facts. At the same time, the lack of communication left scientists thinking that fishers were dismissing advice simply because they did not like its consequences. Conflict between these groups emerged from this distrust, and the result was that collective action was not possible (i.e. managers could not convince fishers to reduce fishing) (Dobbs 2000). By making the process more transparent, cooperative research is considered a “mechanism to renew trust and good faith in the management process” (Kaplan and McCay 2004). One fisher interviewed in this research alluded to the importance of transparency for generating trust in research results:

“I think [with] a lot of us, the results have not been necessarily what we expected going in. But I’ve been there; I’ve watched how it was done…Even though I might not necessarily like the results, I have to support them because I was there. When I hear that from other people, that were involved in other projects, then I have to believe what they have to say.”

Mutual understanding of resource conditions can increase the likelihood that stakeholders will agree upon common rules for resource management (Ostrom 2005). Thus, cooperative research offers an opportunity to improve commons management by reconciling differential pictures of nature that are due in part to the different scales at which knowledge is produced. By working together in cooperative research, fishers and scientists (and managers) can work towards a shared construction of the resource and its management needs. The flow of fishers’ knowledge to scientists is critical to this, since it offers insight at a scale typically not captured in traditional data collection programs. In theory, cooperative research resolves conflict by merging two perspectives into one shared view through a process of translation (i.e. fishers’ knowledge is translated into science through cooperative research). Ideally, fishers, scientists, and managers can agree on the “facts,” and policy discussion can then focus on what to do with those facts. The collaborations aimed at improving the government’s bottom trawl survey are examples of fishers trying to make the government’s construction of nature match their own experiences. Scientists similarly collaborate to improve the accuracy and precision of their surveys and assessments, in part to increase the credibility of their expertise. We do not yet know whether closer constructions will result from these collaborations, but that is certainly the intention.