Towards an eclectic theory of the internet commons

Recent developments in communications technologies and, in particular, the advent of the Internet and its widespread application in all spheres of human activity, have posed a serious challenge to the mainstream neo-institutional theory of the commons (common goods) and, especially, to common-pool resource (CPR) theory. Although the term 'new commons' has been coined to describe a new area of study, previous attempts to analyze Internet goods within the framework of CPR theory have not been successful, as they have failed to capture the important new characteristics of the Internet commons. Based upon an empirical analysis of over 20 Internet goods, the author shows that Internet goods basically do not fall within the common-pool category of goods. In addition to the key characteristics used so far within mainstream theory - excludability and subtractability - other attributes, such as sharing potential, joint use in production rather than in consumption only, and non-hierarchical governance of production, are clearly relevant and should be included in any analysis of the Internet commons.
 
This analysis has proven that the neoinstitutional approach retains its explanatory power with respect to the Internet commons, especially by emphasizing the path-dependent evolution of the Internet and the role of informal and formal rules shaping its operating environment. At the same time, it does not capture the direct impact of major breakthroughs in information and telecommunication technologies (ICT) on the Internet commons. 
 
The proposed eclectic framework, which addresses the specific features of Internet goods, makes it possible to grasp the full complexity of the Internet commons. It integrates new concepts developed in various social science disciplines (economics, sociology, history, anthropology) with the mainstream theory of the commons, developed from the neo-institutional perspective. Among those new concepts and theories, the most important are general purpose technologies (GPT), network externalities, positive free riding, the concept of shareable goods, the architecture of participation, peer production, and the gift economy.


I. INTRODUCTION
'Is There Anything New Under the Sun ...?' was the title of a paper by Charlotte Hess, presented at the Eighth Biennial Conference of the International Association for the Study of Common Property, held in 2000 in Bloomington, Indiana (Hess, 2000). She raised this provocative question with respect to a new phenomenon - technology-driven, human-made commons, including the Internet. Based upon her review of the rapidly growing body of literature on this subject, she pointed out certain apparent weaknesses in current research. Many contributions lacked methodological rigor, particularly reflected in ambiguity as to the meaning of 'commons', which made it very difficult to position the growing literature on the 'new commons' relative to mainstream commons theory, which predominantly focuses on common-pool resources (CPRs). In conclusion, Hess emphasized the need to promote high-quality research, particularly on the Internet commons, capable of overcoming some of the weaknesses recognized at this initial stage.
This paper attempts to contribute to the current theoretical debate on the Internet commons, by addressing the following research questions: 1. To what extent do the Internet commons demonstrate the two principal characteristics (bio-physical attributes) that have been utilized widely in the analysis of traditional commons; specifically, excludability and joint consumption? 2. Which new intrinsic characteristics of the Internet commons make them a distinct category, thus justifying extension of the theoretical framework to include these characteristics?
While conducting such analysis, one cannot avoid references to the neoinstitutional framework, the principal platform upon which the mainstream theory of commons has been constructed 2. The neoinstitutional approach has been attractive for the analysis of commons for two reasons. First, the core argument that 'institutions matter' is particularly relevant to CPR goods, where the classic market mechanism cannot function in an efficient way. Second, neoinstitutionalism offers a powerful base for promoting proactive analysis and research, leading to the design of specific institutional arrangements that enable the more efficient production and consumption of particular CPRs. I refer, for example, to the Institutional Analysis and Development (IAD) framework. Consequently, I have also included a third research question: 3. Does the neoinstitutional approach retain its explanatory strength and usefulness in the investigation of the Internet commons?
As an initial step, I had to resolve some key definitional issues, in order to avoid the risk that the analysis becomes muddled. Unlike classic CPRs, the Internet is not a single good, but rather a 'composite' of many goods within each of its three basic layers: hardware, software, and information content. In order to draw some general conclusions, I will investigate, in greater detail, the principal characteristics of individual Internet goods. In this analysis, I will adopt the definition of commons that has prevailed in the CPR literature, which emphasizes the crucial role of the key biophysical attributes of goods - non-excludability and non-rival consumption - differentiating common goods from private goods. It is because of these characteristics that the classic market mechanism for the production and exchange of common goods largely becomes inefficient. The tension between individual and group rationality has been exemplified in the formulation of social dilemmas - the tragedy of the commons, the prisoner's dilemma, and the logic of collective action (Olson, 1965) - these being the key conceptual pillars of the mainstream theory of commons.
Although the categorization of various goods using the two characteristics mentioned above was initiated by economists in the context of the public goods debate (Samuelson, 1954), its refined formulation by E. Ostrom and V. Ostrom laid the foundation for the theory of commons in the social sciences (Ostrom, Ostrom, 1977). By the same token, in the forthcoming analysis, I have decided not to refer to the broader meaning of commons as a 'public space' - originally a town square, village green, etc. This does not mean that I consider it less important. Just the opposite: I believe that the Internet adds a fundamentally new dimension to the concept of commons as a public space. However, its detailed analysis would call for addressing a different set of research questions and using different tools and methodologies, which is beyond the scope of this paper.
The mainstream theory of commons has largely concentrated on one category of goods - common-pool resources (CPRs) - in which case the difficulty of excluding unauthorized beneficiaries coincides with rival consumption once a certain degree of exploitation of a given resource is reached. Other non-private goods with alternative combinations of the key biophysical attributes - like pure public goods and so-called 'club goods' (toll goods) - have received much less attention from scholars. In my analysis of Internet goods, I adopt a broader framework, by including all combinations of excludability and subtractability. I believe this is indispensable because, in the case of the Internet, I investigate a conglomerate of goods, and the investigation of internal diversification becomes the key objective of the analysis. This, in turn, calls for adopting a broader definition of commons which, in addition to CPRs, also includes pure public goods and club goods 3.
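The two-attribute typology that underpins this broader framework can be illustrated with a short decision rule. This is purely the author's sketch (the function name and examples are illustrative, not drawn from the literature); the sample classifications in the comments anticipate arguments made later in the paper.

```python
# Illustrative sketch of the classic excludability/subtractability typology.
def classify_good(excludable: bool, subtractable: bool) -> str:
    """Return the standard category of a good given its two key
    biophysical attributes."""
    if excludable and subtractable:
        return "private good"
    if excludable and not subtractable:
        return "club (toll) good"
    if not excludable and subtractable:
        return "common-pool resource (CPR)"
    return "pure public good"

# Example classifications (as argued later for specific Internet goods):
print(classify_good(excludable=True, subtractable=True))    # telephone cables: private good
print(classify_good(excludable=False, subtractable=False))  # open protocols: pure public good
```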
The rest of the paper is organized as follows. First, I look at the rise and evolution of the Internet from the neoinstitutional perspective - testing its general explanatory strengths and the applicability of its specific concepts and methodologies. Next, I conduct an empirical analysis of various goods within the Internet domain, focusing on the key characteristics (non-excludability and non-rival consumption). Identifying new and important characteristics of the Internet commons that are not adequately captured within the mainstream theory of commons leads to the proposal of an eclectic framework, wherein new concepts from other disciplines are incorporated, thereby broadening the research perspective. Key findings are summarized in the final section.

II. INTERNET AND THE NEOINSTITUTIONAL APPROACH
Although the origins of the neoinstitutional approach in economics are reflected in the early work of Ronald Coase (Coase, 1937), the development of the methodological framework and the analytical apparatus, as well as the extension of the scope of analysis to include other branches of the social sciences - like sociology, law, and political science - is of recent origin, all having taken place since 1960. This coincided with the rise of the Internet - a new phenomenon which, despite a very short history of less than 50 years, has brought about fundamental changes in practically all aspects of human life - economic, social, cultural, educational, political, etc. (Abbate, 1999). This short history represents a major limitation with respect to the full-fledged application of neoinstitutional methodology and its analytical tools. While emphasizing the crucial role of institutions, particularly the informal rules (customs, norms, values), the advocates of the neoinstitutional approach point to the long-term process of shaping such institutional environments (Williamson, 2000). In the case of the Internet, neoinstitutional analysis may, in fact, reflect an ongoing process of shaping the relevant institutions. This represents a major limitation as to the scope and depth of the analysis. Consequently, the findings and conclusions drawn from such analysis must be evaluated with caution.

Path dependency
Path dependency is one of the key concepts of neoinstitutional analysis. It reflects a situation wherein outcomes are "shaped by a previous sequence of decisions" (Ostrom, 2007: 351). These decision sequences can be partially incidental; on the other hand, they may result in enduring patterns, procedures and standards. The persistence of the QWERTY keyboard, despite the availability of ergonomically superior alternatives, is a classic example of the path-dependent phenomenon. During the short history of the Internet, and particularly its formation period in the 1960s, there were numerous examples of path-dependent sequences of events.
Let us first examine the military traditions relating to the Internet. Although the initial impulse and funding came from the government military sector, from the very beginning, members of the academic community enjoyed great freedom while initiating projects. They fully exploited such freedom while shaping the principles and operating rules which became pillars of the Internet and which largely contradicted military traditions. Examples of this are:
• The lack of a central command (co-ordination) unit. Instead, a democratic procedure based on consensus was introduced to define detailed operational procedures;
• The principle of network neutrality, which implied that the Internet network, as such, shall be as simple as possible, whereas all specialized elements (network intelligence) shall reside at the level of end-user stations;
• The open access principle, reflected in the determined effort by the Internet founders to establish a technical infrastructure allowing local networks, both in the United States and abroad, to join the emerging global Internet structure.
It is worth emphasizing that such developments took place during the Cold War, a period of outright hostility between the Western world and the Soviet bloc, resulting in severe restrictions on cross-border movements and severely blocking the transfer of technologies with potential military applications. In the short history of the Internet, one can point out an interesting demonstration of path dependency resembling the classic QWERTY case (Leibowitz, Margolis, 1990). Here, we have in mind the system of allocating domain addresses to end-users based on a 'first come, first served' principle. Although there are obvious deficiencies with such an allocation mechanism, combined with the oversimplified categorization of domain addresses, it has remained basically unchanged, as accumulated experience and tradition have made changes difficult to implement.
Finally, one cannot neglect the path-dependent outcomes of the initiatives taken by Bill Gates in the mid-1970s. At that time, software programs and operating systems had been developed with the aim of enhancing efficiency and compatibility between large mainframe computers. Thus, software had not been perceived as a separate product, but rather as an auxiliary application, making the purchase of hardware more attractive. Also, making the source code freely available was quite natural, as it facilitated the elimination of certain flaws and the configuration of computers to satisfy individual needs. In 1976, the then 20-year-old Bill Gates, at that time an already recognized IT specialist and nascent entrepreneur, challenged the open access principle. In his 'Open Letter to Hobbyists', he laid down key arguments in favour of proprietary software, pointing to the detrimental effects of the free access principle for software providers, and thereby spurred the development of the software industry in general (Gates, 1976).
The initiative taken by Bill Gates should be seen as a turning point in the development of the software retail market, in that it now started to operate independently of the computer hardware market. But the success of Bill Gates and his company Microsoft, as a founder and key player in the emerging software industry, was not merely the result of technological and professional superiority, but also of Gates' affinity for institutions - contracts, laws and regulations introduced in order to protect software providers against the unauthorized use of their products. At the same time, Gates' initiative paved the way for the consolidation of the open source movement, which initially relied exclusively on unwritten rules and practices established during the formation period of the Internet. The visible outcome has been a variety of formal institutions related to the free/open source movement.

Informal rules
The experiences of the 1960s - during the early 'formation period' of the Internet - were crucial for shaping the informal rules which became the pillars of 'Internet culture'. Looking at the relationships among the pioneers at that time, networking was the key coordinating mechanism in the implementation of initial projects (see Himanen, 2001; Castells, 2003). According to Frances et al. (1991), the efficient functioning of networks largely depends upon the spirit of cooperation, loyalty and trust among network members. This, in turn, coincides with the amount of social capital, defined as a set of informal values and norms shared by group members (Fukuyama, 1999:26). In the community of early Internet enthusiasts, there was a very favourable climate for building such capital and facilitating efficient network coordination. First, the people involved in the early-stage projects had similar occupational backgrounds, coming mostly from academic circles. Secondly, they represented an emerging IT profession that was considered highly complex and, therefore, hermetic to outsiders. This helped to forge internal ties between project team members.
What were the most crucial informal norms and values guiding the implementation of early-stage Internet projects? One should definitely mention a decentralized philosophy, encouragement of bottom-up initiatives, respect for minority views, and strong efforts to reach consensus (accepting, however, non-unanimous voting). This may sound like an essentially egalitarian, socialist orientation, typically demanding some sacrifices with respect to efficiency and quality. However, this definitely was not the case during the Internet's formation period, which was strongly influenced by the cult of professionalism, based on merit and not on formal criteria. As a result, the best professionals were invited to join project teams, irrespective of their formal education, status, age, nationality, sex or race 4.
The mutual trust based on the accumulated social capital within the network was crucial in shaping, from the very beginning, a broader, global vision of the Internet. It was so powerful that it overshadowed the narrow orientation that initially had been pursued by the government agencies focused on building specific military applications.

The convergence of informal and formal rules
Although the rise and early development of the Internet has been governed mostly by informal rules, formal rules (laws and regulations) have emerged as the Internet space has been increasingly used for commercial activities, in order to protect the economic interests of parties involved in market transactions. The extension of existing intellectual property protection regimes, to include software and, later, information content stored electronically, can be mentioned as the primary examples of this trend. The issue of convergence (or lack thereof) between formal and informal rules represents one of the interesting subjects tackled within neoinstitutional analysis in sociology (Nee, 2005:34). Convergence is reflected in the close-coupling of formal and informal rules. However, when the formal rules are at odds with the norms and values of the network community, this gives rise to a decoupling through opposition norms.
The relevant processes of close-coupling and de-coupling through opposition norms have indeed taken place as part of the Internet's evolution. The process of commercialization of important segments of the Internet space has resulted in formalized restrictions relating to source codes and information content, thus reflecting close-coupling between said laws and regulations and the core set of capitalist values emphasizing profit motives.
Simultaneously, however, this process has clashed with the norms and values which have crystallized during the formation stage of the Internet, reflecting its free access tradition. The reaction, in the form of specific opposition norms, went far beyond the meaning of such norms in the sociology literature, defined as those enabling "networks to coordinate action to resist either passively, through slow-down or non-compliance, or in manifest defiance of formal rules and the authority of organizational leaders" (Nee, 2005:33). The advocates of 'free access' not only resisted the emerging formal infrastructure facilitating commercialization, but actively embarked on setting up some formal regulations that were 'close-coupled' with open access norms and values. What has emerged, as a result, has been a dual structure Internet, composed of commercial and free access segments. Assessing each segment separately, I observe close-coupling of informal and formal rules but, at the same time, clashes remain evident between the two segments.
Do institutions really matter in the case of the Internet commons?
The analysis conducted so far has demonstrated that the neoinstitutional methodology and apparatus can be useful in the analysis of such a multidimensional phenomenon as the Internet. Nonetheless, let us look at the explanatory strength of the neoinstitutional approach from a broader perspective. To what extent have institutions mattered during the early period of Internet development? Which elements of the institutional infrastructure that were in place some 50 years ago facilitated its emergence? Surprisingly, it is hard to find convincing answers to those questions. The opposite causal relationship seems to be much more evident. A major breakthrough in information and telecommunication technologies (ICT), which encompasses the Internet, has greatly facilitated institutional development - just to mention the functioning of networks as a coordinating mechanism of human activities in various spheres - socio-cultural, economic, professional, political, etc. - which now can operate very efficiently on a global scale.
The lively academic debate that has taken place over the last 30 years, as to the importance of institutional development versus technological change as the key driver of socio-economic development, is particularly relevant here. Presenting the key orientations and directions of this debate is beyond the scope of this paper. I shall, however, point out a promising avenue by which to resolve the above-mentioned dilemma, one based on the dual, self-reinforcing relationship between institutions and technological progress (Nelson, 2002). A related issue is the way in which technology and technological change are incorporated within the framework of neo-institutional analysis. Some authors (Hira, 2000) point to an apparent weakness in the neoinstitutional theoretical framework, which sees technology as an agent of change through institutional proxies, rather than also examining its direct impact.
In my view, there is some merit in the above-mentioned criticism of the neoinstitutional approach. But when it comes to the major technological breakthroughs, the need for incorporating the impact of such breakthroughs in analysis, both in relation to institutional development and as an independent factor of change, becomes inevitable. The rise of the Internet and the progress in the ICT field definitely represent such major breakthroughs. Consequently, one may conclude that the neoinstitutional perspective can be useful in the analysis of Internet commons, but it may not be sufficient to explain all dimensions fully. Incorporating alternative theoretical perspectives may be necessary, as well.

III. KEY CHARACTERISTICS OF INTERNET GOODS
The Internet is a complex, heterogeneous good that can be defined in alternative ways, according to its different functions. In this paper, I shall adopt the broad definition of the Internet, viewed as a system of communication between users (see Solum, Chung, 2003; Benkler, 2000). Similar to other systems of communication, such as language, the Internet consists of three distinct layers: a physical layer, a logical layer and a content layer (Benkler, 2000:68). The physical layer enables communication by providing physical equipment; the logical layer is responsible for maintaining the code; and the content layer is the final message that is transmitted from sender to recipient. This categorization constitutes the basic axis of the following analysis.
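The three-layer model described above can be summarized in a small, purely illustrative data structure; the example goods listed for each layer are the author's assumptions, chosen to match the discussion that follows:

```python
# Hypothetical sketch of the three-layer model of the Internet.
INTERNET_LAYERS = {
    "physical": ["computers", "routers", "cables", "wireless networks"],
    "logical": ["TCP/IP", "HTTP", "HTML", "applications and software"],
    "content": ["web pages", "email messages", "music files"],
}

def layer_of(good: str) -> str:
    """Return the layer to which a given Internet good belongs."""
    for layer, goods in INTERNET_LAYERS.items():
        if good in goods:
            return layer
    raise KeyError(f"unknown good: {good}")

print(layer_of("TCP/IP"))   # logical
print(layer_of("routers"))  # physical
```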
In order to determine whether the term commons is applicable to Internet goods, I shall first conduct an analysis of their key biophysical attributes - excludability and subtractability - in line with the methodology adopted in the mainstream theory of commons. Additionally, I shall identify some other characteristics which differentiate the Internet commons from traditional common goods. While embarking on such a task, I should indicate some potential difficulties, as the particular attributes largely depend upon certain technical features of a given good. Analyzing the technical aspects of the Internet is beyond the scope of the present paper, so I will restrict references to technical aspects to a minimum (see Clark & Blumenthal, 2001; Lemley & Lessig, 2000).

The physical layer
The physical layer consists of all the physical tools and equipment that enable the process of communication. These include computers (the equipment to produce and send information), and tools that enable communication between computers - routers, wires, cables and wireless networks. Recently, a wide range of mobile devices - such as mobile phones, handhelds and mp3 players - have been added, which also facilitate communication. Following the terminology adopted in the mainstream theory of commons, the key hardware components basically are purely private goods (excludability combined with rival consumption). However, a closer look reveals a somewhat more diversified picture.

a) Telephone cables
In order to connect to the Internet, the user needs a computer connected to a telephone network, a cable line or some other device that can receive a wireless signal (like a router). Cables that transmit the signal to the end user belong to telecoms or cable operators. These are private goods, which are extremely expensive due to the high costs of building and maintaining the cable infrastructure. Therefore, ownership of the cables usually belongs to big telecommunication companies, which generate profits by leasing the cables to smaller operators. The dominant and often monopolistic position of the telecoms results in very strict policies related to sharing access to cables. These relations also are monitored and controlled by designated governmental agencies. Therefore, the cable infrastructure falls into the category of regulated private goods.

b) Wireless networks
Wireless networks are an alternative source of Internet connection. These networks have distinctive features, making them an interesting object of analysis. Until recently, due to the high level of radio wave interference, consumption of the radio spectrum was rivalrous, thereby falling within the CPR category. The spectrum was considered a scarce resource that needed to be regulated by the authorities. Accordingly, the alternative governance structures were based either on access restricted to specialized government agencies (collective goods) or on the privatization of specific frequencies through concession systems (private goods).
Recent technological developments have enabled smart receivers to distinguish signals from different sources, thereby allowing the sharing of certain frequencies (Benkler, 2006:88). Consequently, multiple users can use the same frequency with very little or no decline in the quality of service. For that reason, modern wireless networks are becoming an open alternative to closed broadband networks. In terms of their key common characteristics, one must differentiate between three broad categories of wireless networks:
• Commercial networks owned by telecommunication companies or private corporations, to which access is granted to subscribers or designated members. These networks fall into the club goods category;
• Municipal networks built by local governments, to be freely used by citizens or visitors. Municipal networks can be classified as public goods - the costs of providing the good are covered by the local government, and access usually is free to all citizens or users in range of the network;
• The most interesting category of wireless networks is comprised of open, bottom-up networks, usually set up by individuals who allow anyone to connect to their transmitters. These are so-called social WiFi networks or mesh networks, and they generally fall within the public goods category.
Technological developments in wireless communications also have given rise to the widely spreading concept of an 'open spectrum'. Open spectrum proponents call for decision makers to deregulate and free up the radio spectrum (Benkler, 1998). They claim that, as a result of recent technological developments, radio waves have become public goods. Consequently, they appeal to regulators to create a friendly, institutional environment, facilitating and encouraging bottom-up civic initiatives.
To summarize, wireless networks are in the midst of major changes. New institutional arrangements that allow for the transformation of classic private goods into club or public goods are particularly interesting. The informal networks serve as a primary example of the impact major technological developments in the ICT field have had on the common character of specific goods that comprise the physical layer of the Internet structure.

c) Computer hardware
These goods should be classified as classic private goods, owned by individuals or organizations. What is interesting, however, is that they can change their purely private character once the equipment is connected to the Internet. This is because personal computers hooked to the network can share excess processing power or disk storage. Since this involves practically no additional costs to the owner, it paves the way for providing excess processing power free of charge to perform certain socially valuable initiatives.
There are numerous examples of the sharing of private resources for the benefit of the community, such as distributed computing projects like SETI@home and peer-to-peer networks. It should be underlined that individuals share their resources with a community of strangers, often having little or no knowledge about the way the resources will be used. One, therefore, may note a certain duality with respect to private versus common character, which becomes an essential characteristic of some Internet goods.
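The volunteer-computing pattern behind projects such as SETI@home can be sketched, in highly simplified form, as a coordinator that splits a large job into work units and hands them to donated machines. The function names and the round-robin scheduling below are the author's illustrative assumptions, not the actual SETI@home protocol:

```python
# Minimal sketch of distributing work units among volunteer machines.
from typing import Callable, List

def distribute(work_units: List[int],
               volunteers: List[Callable[[int], int]]) -> List[int]:
    """Assign each work unit to a volunteer node (round-robin)
    and collect the computed results."""
    results = []
    for i, unit in enumerate(work_units):
        node = volunteers[i % len(volunteers)]  # next donated machine
        results.append(node(unit))
    return results

# Two 'volunteers' donating spare cycles to square numbers:
volunteers = [lambda x: x * x, lambda x: x * x]
print(distribute([1, 2, 3, 4], volunteers))  # [1, 4, 9, 16]
```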
Moreover, private goods that are jointly used within a network can become valuable resources for commercial activities. One of the most profound examples is Skype, the company that offers free VoIP calls. Skype uses the power of computers connected to the network to establish VoIP connections. When users install and run Skype on their computers, they join the community of users that forms a peer-to-peer network. The network enables free Internet calls (benefitting users) but also brings revenue to the private company. In the context of the analysis of the Internet commons, this example points to the apparent diversity of existing patterns. Here, the private goods (computers) shared for free are being commercialized by a private company while benefitting a community of users. It is worth noting that, due to the high complexity of the system, users often are unaware that their computers are taking part in such an exchange.

The logical layer
The goods that constitute the middle, or logical, layer are internet protocols, technical standards and software that enable the process of communication.

a) Technical standards
Technical standards (technical specifications and protocols) define the rules of communication between computers, and are the core element of the logical layer. The choice of a technical standard not only influences the basic operations of the computer system, it also has vital implications for the hardware industry, internet providers and end-users. Abbate stresses that standards are a form of control not only over technology, but also over the user (Abbate, 1999:147). Technical standards can be either open or closed, as far as ownership and availability are concerned.
The Internet has been built on open, non-proprietary protocols, such as the TCP/IP suite of protocols, which were defined as the common property of the whole Internet user community (Solum, Chung, 2003:23). Tim Berners-Lee, the inventor of the World Wide Web, deliberately kept the HTTP protocol and the HTML programming language open, in order to maintain the innovative character of the network for as long as possible (Berners-Lee, 1999: 36). Such open standards are non-rival resources, open for everyone to use without access restrictions. Therefore, they should be classified as public goods, being the common property of the Internet user community.
While the network technical standards responsible for the seamless cooperation of different networks and devices are, to a great extent, open, many technical specifications and file formats are proprietary and closed - protected by copyright or patent law. These formats are designed by companies that want to have exclusive rights to produce specific software. Among the examples of proprietary formats are the following types of files: .doc (text), .pdf (text), .gif (images), flash (animation), and .mp3 (music). Proprietary standards, owned by private entities, fall into the category of private goods or club goods. One is not allowed to tinker with the formats without the copyright holder's consent, and access to specific tools is granted only to licensees.

b) Domain name system
Among the most important resources within the logical layer are IP addresses and domain names. These are private goods governed by a dedicated organization called the Internet Corporation for Assigned Names and Numbers (ICANN). The right to use a given Internet domain name is granted to a person or organization that rents the domain for a limited period of time (with priority to extend that period, if desired). The first-come, first-served rule applies to new and previously-unknown domain names. Otherwise, a person or organization must prove that it should be granted exclusive rights to use a certain domain.

c) Applications and software
Applications and software comprise the largest group of resources within the logical layer. Software enables computers to perform productive tasks for users: it translates content and commands from machine language into human language and vice-versa. Software includes operating systems, programming tools, and applications (word processors, email clients, video software, games, etc.). Although computers can work perfectly well without an Internet connection, more and more software is designed to work within the network. Therefore, I do not make a distinction between Internet and non-Internet software.
As discussed in the previous section, in the early formation stage of the Internet, the prevailing approach was based on the principle that software should be open and non-proprietary. This all changed, due to Bill Gates' initiative back in 1976, which resulted in the division of the software market into two segments: proprietary software and free/open source software. The owner of the rights to proprietary software imposes strict restrictions on users concerning its use, copying, and modification. In the case of proprietary software, the source code is closed as well. Such software can be distributed either for free (Internet Explorer, Adobe Acrobat Reader) or licensed to individual or multiple users (Microsoft Office).
On the other hand, free/open source software is characterized by free access to source codes and a wide range of freedom for the user. The majority of f/oss distributions are given away for free; but there also are versions of the software that are distributed on a commercial basis.
Computer software is an intangible good in digital form. Software code can be copied almost for free (the cost of a recordable CD is negligible) and without any loss of quality. Therefore, computer software, both non-proprietary and proprietary, belongs to the category of non-rival resources. In the case of proprietary software, the owner can limit the number of unauthorized users by selling licenses. Thus, proprietary software falls into the category of club goods. On the other hand, open source/free software is both non-rival and difficult to exclude. Therefore, free/open source software can be classified as a public good.

The content layer
The resources in the content layer of the Internet are intangible. These are products of the human intellect in digital form: information, ideas, knowledge, music, text, videos and images. Information is the key resource that forms the basis for all the other goods in the content layer.
CPR scholars have recognized that informational resources are of a different nature than tangible, material goods. Hess and Ostrom stressed that analyzing information is much more tenuous than analyzing natural resources, because "Information, on the other hand, often has complex tangible and intangible attributes: fuzzy boundaries, a diverse community of users on local, regional, national, and international levels, and multiple layers of rule-making institutions" (Hess & Ostrom, 2003:132). It should be noted that recent technological developments, especially digitization, have redefined the whole environment of informational goods. In the 'analogue' era, artefacts were physical and mostly private goods, and the infrastructure that made these resources available to the public, like libraries and archives, comprised either club goods or public goods. Ideas always were a common good and have remained as such within the digitized environment.
With regard to Internet goods, examples of informational goods include various resources in digital form. Although traditional artefacts in physical form belong to the category of classic private goods, digital artefacts have different attributes and therefore represent a wider range of goods. One of the most distinctive features of digital artefacts is the possibility of copying without loss of quality and at almost no cost. This makes digital artefacts non-rival goods, as one person's use of a digital file does not subtract from another person's capacity to use it. Free copying changes the whole process of distribution, enabling the immediate delivery of digital artefacts. Additionally, new technologies entering the market have facilitated the collective production of informational goods, so that they have become common property. Thanks to collaborative website tools, such as wikis, users can edit and share information easily with others. Authors also have gained new tools for managing their copyrights: special copyright licenses that enable copyright holders to grant some of their rights to users while retaining others. Examples of new licensing models include Creative Commons licenses and GNU Free Documentation Licenses. The digitization of artefacts and the whole electronic environment for producing information have fundamentally altered their basic attributes. On the one hand, their consumption has become non-rival. As for access, the existing arrangements range from full exclusion of unauthorized users to free access, with some intermediary solutions. Consequently, digital artefacts fall within various categories of common goods, depending upon the degree of exclusion.
As far as the infrastructure is concerned, one also may note a visible shift to digital forms. Digital repositories and databases typically fall within the category of club goods (they are open only to authorized users), but there is a growing number of digital open access archives that gather informational resources which are free for anyone to use. Therefore, one may conclude that, within the realm of informational goods, one can observe very interesting changes corresponding to the radical innovations being developed in information technologies, which are shaping entirely new conditions for the production and exchange of such goods.

The need for an eclectic approach
Based upon the analysis presented in the preceding sections, I will now summarize the key findings in the context of the research questions formulated in the Introduction. There is definite merit in applying the notion of commons to individual Internet goods, and also to the network as a whole. At the same time, however, the Internet adds new dimensions to the overall concept of commons, so that the question raised by Charlotte Hess back in 2000, "Is there anything new under the sun?", definitely can be answered in the affirmative.
Let us now address the issue of the Internet's 'novelty' in the context of the mainstream theory of commons. Definitely, the key novel aspect that scholars must confront is its 'composite' structure. With respect to the two classic biophysical attributes, I have noted great diversity in Internet goods (see Figure 1). It should be stressed that the process of placing Internet goods on the map of common goods can only be indicative; we do not have precise tools to measure the intensity of these two basic attributes (excludability and subtractability).
Among Internet goods, many fall within the pure public goods category. The most striking evidence of 'novelty' in the context of mainstream CPR theory is the virtually empty 'CPR quadrant'. With the exception of traditional radio waves, it is difficult to identify any other Internet good that falls within the CPR category. This is a very important conclusion with regard to the explanatory power of CPR theory as applied to Internet goods. The analytical focus of CPR theory was confined to a particular set of biophysical attributes: non-excludability combined with rivalrous consumption. This combination of key biophysical attributes is almost non-existent in the case of Internet goods, so that the major concepts and theoretical underpinnings of mainstream CPR theory simply do not apply to the Internet commons.
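The quadrant mapping described above can be sketched as a small classification routine. The example goods and their attribute scores below are purely illustrative assumptions (as noted, no precise measurement tools exist); the sketch only demonstrates how the two classic attributes partition goods into the four categories.

```python
# Illustrative sketch of the classic two-attribute typology of goods.
# Scores (0..1) for the example goods are hypothetical, not measurements.

def classify(excludability: float, subtractability: float) -> str:
    """Map the two biophysical attributes to one of the four quadrants."""
    if excludability >= 0.5:
        return "private good" if subtractability >= 0.5 else "club good"
    return "common pool resource" if subtractability >= 0.5 else "public good"

internet_goods = {
    "open TCP/IP protocol":         (0.1, 0.1),  # non-excludable, non-rival
    "proprietary software licence": (0.9, 0.1),  # excludable, non-rival
    "traditional radio spectrum":   (0.2, 0.8),  # hard to exclude, rival
    "standalone laptop":            (0.9, 0.9),  # excludable, rival
}

for good, (excl, subtr) in internet_goods.items():
    print(f"{good}: {classify(excl, subtr)}")
```

Run against these hypothetical scores, only the radio spectrum lands in the CPR quadrant, mirroring the 'virtually empty quadrant' observation in the text.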
Until recently, problems associated with network congestion were perceived as similar to the excessive exploitation of natural resources, the main field of study of CPR scholars (Bernbom, 2002). However, technological advancements and growing bandwidth capacity have led to more efficient Internet connections and the transmission of larger files. Such revolutionary changes in physical conditions are hardly to be expected in the case of natural resources.
Hence, there is an obvious need for extension of the existing theoretical analysis of commons, so as to encompass particular features of Internet goods. This will not be an easy task, particularly due to their diversified but, at the same time, interrelated structure. We need to develop new methodological tools to investigate how the key biophysical attributes used in CPR theory interact at the level of individual Internet goods and affect the functioning of the Internet as a composite structure.
An additional important feature of Internet goods is their duality. The same goods can, under different institutional arrangements, fall into either the public, private or club goods category. Duality can be observed in cases of open versus closed computer software, or open versus proprietary information resources. Another manifestation of duality is the previously-mentioned case of personal computers, which change their attributes when they work within a network. A piece of hardware, like a portable computer, represents a classic private good when standalone; but, once connected to a network, it can reflect certain features of a club or semi-public good.
The rise of the Internet gives new impetus for incorporating common goods other than CPRs, categorized using the non-excludability and joint-consumption criteria; namely, (pure) public goods and club goods. The former category meets both the non-excludability and non-rival consumption criteria (e.g. watching the sunset or benefitting from national defence protection); in the pre-Internet era it was very narrow and, therefore, attracted limited attention from the research community. This needs to change once Internet goods are investigated, as many of them can be categorized as pure or semi-pure public goods, just to mention open standards and open access software, electronic repositories of freely available information and knowledge, etc. As discussed earlier, free access to information in electronic format by millions of Internet users, combined with their active involvement in the generation of said information and knowledge, already has affected human civilization in all important spheres: economic, socio-cultural, political, etc. We need to study these new Internet goods, which calls for revisiting the concept of public goods in various disciplines within the social sciences.
The Internet commons also have opened new perspectives with respect to another category of broadly-defined common goods; namely, club goods. Again, during the pre-Internet era, club goods (easy exclusion combined with non-rival consumption) were quite rare and predominantly local (country clubs, subscribed theatre services). This has changed radically due to the rise of the Internet. Contemporary 'clubs', which gather users of proprietary software and subscription-based proprietary databases, can have millions of members all over the world. Classic club goods used to be treated in academic analysis as impure private goods, whereas the key characteristics of Internet club goods render them closer to pure public goods. Let us take the example of electronic repositories of scientific journal articles, which are accessible to scholars on the basis of subscriptions paid by their university. From the individual scholar's perspective, such a service can be viewed as a pure public good, as she/he is not confronted directly with restrictions on its use (except that an access code needs to be provided).
The above considerations reinforce the key argument that the rise of the Internet requires a major extension of the mainstream theory of commons. Some of the new elements discussed above can be incorporated within the existing neoinstitutional framework. However, there are other features and characteristics of the Internet commons which call for a more fundamental change in the overall conceptual framework of the mainstream CPR theory of commons. First, in the analysis, one needs to incorporate the impact of the radical changes in the history of civilization that have been brought about by the widespread application of the Internet and of information and communication technologies (ICT) in general.
The second argument calling for significant adjustments in the neoinstitutional theory of commons relates to important new attributes that can be identified in the case of Internet goods. True, additional characteristics of individual CPR goods typically have been included in analysis, particularly following the Institutional Analysis and Development (IAD) framework. However, such characteristics have played only an auxiliary role relative to the two basic attributes, excludability and subtractability (Ostrom, Hess 2007: 22-27). In the case of Internet goods, there are other distinct features which seem to be equally important as, if not more important than, the two classic biophysical attributes.
In my view, the current mainstream theory of commons does not provide sufficient grounds for the comprehensive analysis of the new phenomena that have been brought about by the rise and rapid expansion of the Internet. At the same time, we have noted the development of new, interesting concepts, outside the mainstream theory of commons, which might be very useful in explaining the new dimensions of the Internet commons. The proposed scope and structure of the eclectic framework for the expanded theory of commons is depicted in Figure 2. There seems to be little doubt that the above characteristics are profoundly demonstrated in the case of ICT and more narrowly-defined Internet technologies.
What is crucial in the context of the Internet commons debate is that GPT theory emphasizes the broader complementarity effects of the Internet as an enabling technology, changing the characteristics, as well as the modes of production and consumption, of many other goods. If we take the book as an example: in its standalone paper form, available in the bookstore, it is a classic private good. Once it becomes freely available over the Internet in electronic form, it becomes a pure public good. When an access charge is imposed for its electronic version, it becomes a club good. At the same time, we may observe a radical transformation of the entire publishing industry, resulting from the wide application of ICT in the production and distribution of books in the traditional format (digital printing, and distribution via the Internet, e.g. Amazon). Less revolutionary, but very important for many users, is the availability of digitalized versions of books distributed through traditional bookstores alongside their paper versions. We might quote many similar examples of the effects of the Internet as an enabling technology, some of which have been noted in previous sections. Thus, the concepts and key findings of GPT theory seem to be very useful for extending the scope of the mainstream theory of commons, not only to include the specific characteristics of Internet goods, but also to illustrate the wider implications of Internet technologies for the common character of other goods.

Theoretical concepts reflecting the 'sharing potential' of Internet goods
In the preceding section, I identified an important attribute of the Internet commons: its 'sharing potential', which reflects the broadly-defined efficiency of joint consumption (use) of a given good, often by a very large and expanding number of users. Such joint use does not diminish a good's value; quite often, its value increases with the number of consumers. The sharing potential stands in contrast to the characteristics of classic CPR goods, where exceeding a certain number of users causes consumption to become competitive. Concepts which might be helpful for explaining the sharing potential of Internet goods are network effects, shareable goods, and positive free riding.

a) Network effects
The network effects concept was introduced to the economic literature in 1985 by M.L. Katz and C. Shapiro, who used the term 'network externalities'. They noted that, in the case of many products, "the utility that a user derives from consumption of the good increases with the number of other agents consuming the good" (Katz, Shapiro, 1985:424). This concept was later refined by S. Liebowitz and S. Margolis, who argued that network effects, in their pure form, need not coincide with externalities; i.e. positive or negative effects on parties not involved in a given transaction (Liebowitz, Margolis, 1994). The added value for existing and new customers results from direct interactions between the users of a given good, as well as from the increased availability of complementary products and services.
Back in the 1980s and 1990s, when the network effects concept was introduced and refined, traditional telephone systems were viewed as the primary exemplification of these effects. Nowadays, with the rapid development of information technologies, the ICT sector is believed to demonstrate the most significant network effects, just to mention the development of the World Wide Web, where new essential functionalities were added once the number of Internet users increased. The same goes for websites facilitating exchanges via the Internet (eBay) or dedicated portals used for social networking. The developments taking place within the Internet spectrum clearly contrast with the classic CPR scenario. Specifically, the network effects increase radically once the number of users of a given Internet good reaches a certain threshold. For CPR goods, the increased number of users results in the excessive exploitation of a given resource.
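The contrast between the two scenarios can be illustrated with a toy numerical sketch. Both value functions below are illustrative assumptions of mine, not models taken from Katz and Shapiro or the CPR literature: the network good is given a quadratic (Metcalfe-style) value in the number of user pairs, while the rival resource is given a fixed capacity beyond which congestion destroys value.

```python
# Toy comparison: total value as a function of the number of users for a
# network good versus a rival (CPR-like) resource. Both functional forms
# and the capacity figure are hypothetical, for illustration only.

def network_good_value(n_users: int) -> float:
    """Metcalfe-style assumption: value grows with the number of user pairs."""
    return n_users * (n_users - 1) / 2

def cpr_value(n_users: int, capacity: int = 100) -> float:
    """Rival resource: value collapses once the capacity is exceeded."""
    total = min(n_users, capacity)
    congestion_penalty = max(0, n_users - capacity) * 2
    return max(0.0, total - congestion_penalty)

for n in (10, 100, 1000):
    print(f"{n:5d} users: network {network_good_value(n):10.0f}, CPR {cpr_value(n):5.0f}")
```

Under these assumptions, the network good's value keeps accelerating as users join, while the rival resource's value peaks at capacity and then collapses, which is the divergence the text describes.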
While emphasizing the incidence and magnitude of positive network effects, particularly in the case of the Internet commons, I shall point out negative implications as well. These are reflected in so-called 'lock-in situations', profoundly exemplified in the market for professional software applications, presently dominated by Microsoft. The providers of alternative applications, often of superior quality, have faced major obstacles in convincing potential clients to shift from Microsoft Office. The clients were reluctant to lose the obvious benefits derived from network effects: compatibility with the applications used by their business partners, standard installations on purchased hardware, access to auxiliary services, additional compatible applications, etc.

b) Positive free riding
The so-called 'free-riding problem' reflects the principle of the social dilemma: a conflict between individual self-interest and the community. In a classic CPR scenario (costly exclusion and rivalrous consumption), the production of such goods normally would call for voluntary contributions from community members. However, this is in conflict with the maximization of individual benefits among community members who, following homo oeconomicus logic, are not willing to pay for goods which are otherwise freely accessible.
Empirical studies on human behaviour in typical CPR scenarios have proven that homo oeconomicus logic may not be predominant (Stiglitz, 2004: 156). In the following section, I shall present additional concepts explaining the diverse non-pecuniary motives of people involved in the consumption and production of Internet goods. In the case of positive free riding, even while retaining the homo oeconomicus assumption, free riding behaviour can be beneficial for those users who are actively involved in the production of a given Internet good: "Internet reduces the cost of free riding… each free rider actually adds value to the final product. If the free rider is completely passive, she adds value by adding market share (and thus increasing the space for reputational returns to contributors). If she is just a little bit active, the free rider becomes a de facto tester and might just report a bug or request a new feature. The Internet reduces the communication costs of doing all that to just about zero" (Weber, 2000:36). P. Kollock calls Linux an 'impossible public good'; i.e., one having characteristics that would make it impossible to materialize if the classic free riding social dilemma remained intact (Kollock, 1999:3-25).
There are obvious inter-linkages between positive free riding and the network effects discussed earlier. This is because network effects materialize with the increased number of users, including passive free riders (Weber, 2004:154).

c) Shareable goods
In the preceding analysis, I concentrated primarily on the largely positive effects derived from interactions between the increased number of consumers of Internet goods. In turn, the shareable goods concept, introduced by Y. Benkler, refers to intrinsic characteristics of some private goods (computers, wireless transceivers, Internet connections) which facilitate sharing with other users (Benkler, 2004). The key argument behind this concept is that some goods available on the market are 'lumpy'; i.e., they are produced in discrete sizes and represent varying capacities. Typically, available capacity is fully utilized only in specific time intervals; otherwise, it remains partially underused. Such idle capacity of some Internet goods can be shared with other users at practically no additional cost to the owner. Notable examples of capacity sharing are distributed computing projects and peer-to-peer networks.
It should be noted that the issue of available excess capacity is not restricted to Internet goods, but common in other sectors as well. The difference is that, in the former case, sharing is much easier and more efficient. To share excess storage capacity in a warehouse located on the company premises would call for a set of additional logistic arrangements (controlled access to the warehouse, separate storage space, sharing additional costs of monitoring and insurance, etc.). Such problems are practically non-existent when sharing excess computer processing capacity via the Internet network.
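The idle-capacity argument above lends itself to a minimal sketch. Everything here is hypothetical, the owner names, capacity figures and the notion of measuring capacity in abstract 'units' are my assumptions; the sketch only shows how the unused slack of several 'lumpy' private goods can be pooled, distributed-computing style, at no cost to the owners' own use.

```python
# Sketch of Benkler's 'shareable goods' idea: lumpy private goods (PCs)
# carry idle capacity that a network can pool. All names and numbers
# are hypothetical, for illustration only.

from dataclasses import dataclass

@dataclass
class Computer:
    owner: str
    capacity_units: float   # total capacity of the 'lumpy' good
    own_use_units: float    # the share the owner actually uses

    @property
    def idle_units(self) -> float:
        """Slack capacity that can be shared at ~zero cost to the owner."""
        return self.capacity_units - self.own_use_units

def pooled_idle_capacity(machines: list[Computer]) -> float:
    """Total capacity a distributed-computing project could borrow."""
    return sum(m.idle_units for m in machines)

pool = [
    Computer("alice", 100.0, 20.0),
    Computer("bob",   100.0, 35.0),
    Computer("carol", 100.0, 10.0),
]
print(pooled_idle_capacity(pool))  # 235.0
```

The point of the sketch is that pooling requires no logistic arrangements analogous to the warehouse example: each owner's own use is untouched, and the network simply aggregates the slack.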
Definitely, certain attributes of Internet goods make them more suitable for sharing than other goods. However, the simplicity, cost-efficiency and functionality of sharing are dependent upon the Internet network facilitating the whole process.
The shareable goods theory can be extended easily to include the increased shareability of certain other private goods, once sharing is accomplished via the Internet.
Here, I shall refer to the role of information and communication technologies as enablers of innovation processes spread across various industrial sectors. Clearly, there are numerous examples of increased shareability as a result of Internet connectivity, just to mention carpooling (especially for long-distance travel) and apartment-sharing systems. On the other hand, the simplified, practically cost-free sharing opportunities that exist on the Internet may provide additional powerful stimuli to accelerate the production of specific goods, as profoundly demonstrated by the recent success of YouTube's exchange of short amateur movies.
V. Non-hierarchical involvement in the production of Internet goods
The theoretical concepts presented above help to explain the sharing potential of the Internet commons, basically confined to the consumption process. But even more important are those attributes of the Internet commons which facilitate the sharing of resources in various joint production undertakings. The issue of joint production should not be separated from the particular organizational forms of such production and the motives of those involved in large collaborative undertakings, particularly IT professionals. The recently-formulated concepts of the architecture of participation, peer production and the gift economy seem particularly relevant here.

a) Architecture of participation
The key argument behind the 'architecture of participation' concept, as formulated by O'Reilly in 2004, was that the rules and principles laid down by the founders of the Internet during its early formation period established a very favourable and, in fact, encouraging environment for joining the network, even by inexperienced users, and for subsequent engagement by the most active of them in various collaborative undertakings. The salient feature of Internet architecture is that web activities intended to satisfy individual, egoistic interests, irrespective of their intentions, ultimately contribute to increased collective value (O'Reilly, 2004).
Among the basic rules that shape the participatory structure of the Internet, the end-to-end principle should be mentioned first (Saltzer et al., 1984). It states that the network, as such, should remain as simple as possible, whereas all specialized elements (network intelligence) should reside with the computers of end-users and their software programs and applications. Second, efficient data transfer has been achieved with the help of the open TCP/IP network protocol, under which neither diverse operating systems nor computer brands are discriminated against by the network. Everyone is allowed to use and exchange data with others, because the protocols were made to share, not to exclude (Lessig, 2001). One should also point out the simplicity of the HTML language and the open HTTP protocol, which allow for the flexible and unrestricted creation of new websites, and for the expansion and updating of their contents, even by users with very limited experience.
The participatory architecture of the Internet has been further reinforced by its new technological platform, called Web 2.0. New tools and technologies are meant to enhance the exchange of information and flexible engagement in various collaborative efforts and events. These new developments within the Internet spectrum can be seen as the exemplification of an emerging complex social phenomenon, termed by H. Jenkins 'participatory culture'. It reflects an ongoing process whereby fans and consumers are effectively encouraged to participate in the creation and exchange of new media content (Jenkins, 2007:496). Participatory culture largely contradicts the traditional notion of 'audience', which implied the generally passive, static role of the recipient of information and cultural goods. Needless to say, the Internet network plays a crucial role as an 'enabler', making such cultural participation and exchange possible on a massive scale, with maximum flexibility and efficiency.

b) Peer production
Within the Internet space, many complex, collaborative undertakings, like the development of open source software, are being successfully implemented without the hierarchical control required for projects of similar size performed within large, offline corporations. The peer production concept put forward by Benkler (2002) addresses this particular phenomenon. The term 'peer production' was chosen deliberately to emphasize its distinction from the 'team production' theory formulated by Alchian and Demsetz (1972). They argued that, under conditions where it is very costly to ascertain the contribution of each team member to the value of the product, there is a risk of shirking or free riding. To avoid this, a system of hierarchical control within the firm must be put in place (Alchian & Demsetz, 1972).
Contrary to the above formulation, Benkler claims that, particularly with respect to the information commons, neither the market nor the hierarchical structure of a firm can provide an efficient coordination mechanism for the production of those goods. Such efficiency can be achieved under an alternative, third production model called 'peer production', in which the coordination of large, complex projects is accomplished without hierarchical control, and team members are motivated not by financial incentives but by a range of non-pecuniary motives, including the professional satisfaction derived from creating something new and bringing value to the community (Benkler, 2002). The implementation of large-scale, open-source projects involving thousands of programmers, and the Wikipedia phenomenon, are primary examples of the peer production coordination mechanism.
Benkler points to four key characteristics which render peer production particularly efficient with respect to information goods in a pervasively networked environment:
• Information is a purely non-rival good, both as an output and as an input to the production of other information goods;
• Major technological breakthroughs in the ICT field have contributed to a radical decline in the costs of information production;
• Creative talent, being the primary input in the production of information, is much more variable than traditional labour and material inputs;
• The radical decline in the cost of exchanging information, combined with the increased efficiency of such exchanges within a network economy, facilitates the effective coordination of widely-dispersed resources (Benkler, 2002:404).
The above list of the key attributes of the information commons demonstrates that the basic foundations of the peer production concept are embedded in general purpose technologies theory. It is clear that the recent major breakthroughs in the ICT field were instrumental as the key enablers for the effective implementation of the peer production coordination mechanism.
What is particularly interesting and, in fact, unique in the peer production concept is the analysis of alternative non-hierarchical methods by which to coordinate complex undertakings. This includes, inter alia, the following procedures: the self-allocation of tasks, the widespread use of peer-review for evaluation, and bottom-up initiatives, aimed at identifying best candidates (both among existing and new team members) to perform specific tasks. The educational background and the professional culture of software programmers seem to play important roles here. However, the evident success of the NASA Clickworkers project suggests that professionalism may not be that important. The Clickworkers project has attracted over 85 thousand volunteers involved in identifying (marking) craters on Mars. Its key source of success can be attributed to the project's organization, which allowed for great flexibility as to the scope and timing of individual engagements (Benkler, 2006:69-70).
c) The gift economy
The gift economy, which contrasts with the contemporary 'exchange economy', refers to a social system wherein goods are given (shared) to the community by its members, without explicit assurance of personal return. This typically occurs within small communities which attach high value to stable, robust relationships between community members, based upon sharing, collaboration, honour, trust, scalability and loyalty (Bollier, 2001:19). The socio-economic system of gift exchange has been studied by anthropologists in the context of traditional societies (Levi-Strauss, 1992; Mauss, 1973). Although many examples point to the apparent vitality of the gift economy within the contemporary capitalist system, the rise of the Internet has renewed interest in this concept. The key reason is its attractiveness for explaining the primary motives for sharing information and undertaking various collaborative efforts within a networked economy.
The role of non-pecuniary motives among team members engaging in large-scale projects already has been emphasized within the peer production concept discussed earlier.
In that respect, a gift economy offers a much broader and more dynamic perspective. The rapid expansion of a networked information economy paves the way for broadening the spectrum of gift-giving behaviours, eventually challenging the currently predominant 'exchange economy'. The Internet offers particularly favourable conditions, enabling and, in fact, encouraging the proliferation of gift-sharing attitudes. First, the cost of sharing information is practically negligible. Second, there is great flexibility in the gift-giving process, in terms of the size and timing of the donation. Third, the effects of gift giving, in terms of added social value, can be recognized easily without additional effort. Even if the sharing of information, as such, is not driven by the expectation of reciprocal rewards, the visible effects can serve as a strong incentive for continuing such behaviours in the future.
At the same time, the Internet commons represents a totally different framework for gift sharing, relative to small communities with established direct relationships and shared values between members. Information shared via the Internet typically is offered to a much larger and, more importantly, unknown set of recipients. Kollock points out that such a 'general exchange system' is more generous but, at the same time, riskier than a traditional gift exchange; the absence of direct social links may elevate the temptation to free ride (Kollock, 1999). This argument can be challenged seriously, as, in fact, 'cyber-anonymity' may enhance the gift-sharing behaviours of Internet users, provided that an effective organizational framework ensuring anonymity is in place. The functioning of various discussion groups offers interesting insights into this issue. Most participants probably hesitate to share their experiences in person or through telephone conversations. At the same time, they are most willing to provide comprehensive, valuable advice via the Internet, because they retain full anonymity.

CONCLUDING REMARKS
In conclusion, I want to emphasize that the rise of the Internet poses a serious challenge for the research community studying the commons, calling for a major revision of theoretical concepts, methodologies and analytical apparatus. At the same time, however, it creates fascinating ground for a radical revitalization of the theory of the commons. This is because the object of the study, the Internet commons, is expanding at an unprecedented pace. Simultaneously, its role in the global economy is increasing exponentially, affecting all key spheres of human interaction. As exemplified in the preceding analysis, the structure of the Internet commons is highly diversified, thus calling for scientific exploration of its modalities and dimensions.
The current author believes that, considering the state of the art of the theory of the commons and the challenges connected with the rise and expansion of the Internet, the development of a single theoretical framework encompassing the whole diversity of the Internet commons may not be feasible. The alternative eclectic route seems more promising, particularly as interesting new concepts with direct relevance to the Internet have emerged in various branches of the social sciences. The proposed format, which identifies both open questions and sources of theoretical inspiration, is by no means exhaustive; it serves merely as an initial platform, facilitating academic discussion and, thereby, leading to refinement of the eclectic model.
Finally, a word of caution should be offered. Researchers of the Internet commons are confronted, on the one hand, with the limited timeframe of analysis that results from the very brief history of the Internet. On the other hand, the expansion of the Internet brings fundamental changes that affect practically all spheres of human interaction. This expansion calls for an immediate reaction from the academic community. However, the former consideration points to the significant risk of concentrating research endeavours on issues of a transient nature while neglecting other significant, long-term tendencies.