Abstracts

Panels

  1. Knowledge communities and networks of knowledge
  2. Emerging infrastructure for distributed learning and innovation
  3. Knowledge and place: proximity, mobility, clusters, and institutions
  4. Measuring knowledge and its economic effects
  5. Learning by doing, interacting, and experimenting: users as innovators
  6. Channeling knowledge: the changing role of research institutions
  7. Mixed models of cooperation and control
  8. Architecting knowledge: platforms, modularity, and coordination
  9. Fences and thickets: benefits and costs of private controls


Panel 1: Knowledge communities and networks of knowledge

Patrick Cohendet, Louis Pasteur University
Tom Schuller, OECD
John King, University of Michigan

On knowing communities

Patrick Cohendet, Université Louis Pasteur (BETA) Strasbourg, and HEC Montréal

A knowing community can be considered the all-important site of knowledge formation; the site where hybrid knowledge inputs meaningfully interact. As Brown and Duguid (1991) suggest, 'it is the organization's communities, at all levels, who are in contact with the environment and involved in interpretative sense making, congruence finding and adapting. It is from any site of such interactions that new insights can be co-produced.' Accordingly, one can assume that the process of generating, accumulating, and distributing knowledge—both in sites of informal interaction and in formally constituted units such as R&D labs—is achieved through the functioning of informal groups of people, or autonomous 'communities', acting under conditions of voluntary exchange and respect for the social norms defined within each group. Communities can thus be considered key building blocks of the organization and management of corporate innovation and creativity.

Amongst the various forms of knowing communities, the literature distinguishes two main types of cognitive communities:

  1. Communities of practice are groups of persons engaged in the same practice, communicating regularly with one another about their activities. Communities of practice can thus be seen as a means for a group of agents to enhance their competencies in a given field of knowledge, a goal reached through the construction, exchange, and sharing of a common repertoire of resources (Wenger, 1998).
  2. Epistemic communities are small groups of "knowledge-creating agents who are engaged on a mutually recognized subset of questions, and who (at the very least) accept some commonly understood procedural authority as essential to the success of their collective activities" (Cowan et al., 2000, p. 234). Because of agents' heterogeneity, the first task of epistemic communities, for the sake of knowledge creation, is to create a codebook. While communities of practice focus on the accumulation and exploitation of knowledge in a given field, epistemic communities tend to focus on the exploration of a new domain of knowledge.

As the knowledge-based economy expands, we consider that communities will play an increasing role, because, through the passion and commitment of their members to a common goal or practice, they can bear significant parts of the 'sunk costs' of the process of generating and accumulating specialized parcels of knowledge. These sunk costs (and, more generally, fixed costs) correspond, for instance, to the progressive construction of the languages and models of action and interpretation that are required for the implementation of new knowledge, and that cannot be covered through the classical signals of hierarchies (or markets). This setting is likely to compensate for some of the organizational limitations (learning failures) that firms face when confronted with the need to innovate continuously and produce new knowledge.


Social capital, networks and communities of knowledge

Tom Schuller, CERI/OECD

The paper addresses three issues. The treatment will be largely schematic, but illustrated by reference to work on education systems and the role of social capital (SC) in educational performance.

First, I consider the application of social capital theory to the notion of knowledge communities. Networks are fundamental to SC, but their conceptualisation and the analysis of how they work are often sketchy. I take the three standard forms of SC – bonding, bridging and linking – and discuss them in relation to four levels or strands of network: between individuals; institutions; disciplines; and professional groups.

Secondly, I outline a simple model of growth in knowledge societies, with three dimensions: accumulation, distribution and validation. I argue that the role of social capital may be important in all these dimensions, but particularly the third. Access to people whose judgment of the validity of putative knowledge can be trusted is important at all levels, from illiteracy to high professionalism.

Thirdly, previous OECD work on knowledge management has pointed to the role of users in innovation. I address the question of user involvement in the application of knowledge, with specific reference to the delivery of services. This includes some discussion of how the benefits of knowledge are to be measured.


Creation and sustainment of epistemic infrastructure

John King, University of Michigan

Epistemic infrastructure is the deep capability of societies to apprehend, reflect upon, refine, codify, and preserve vital cultural and social memory in ways that facilitate its transfer across distance and over time. Much of this infrastructure is recognized in educational apparatus at all levels, but a great deal of it is found in what are often called "cultural institutions" such as libraries, archives, museums, galleries, zoos, and other major collections. The rise of the internet and its ancillary affordances raises the possibility that epistemic infrastructure of the sort developed over centuries will be displaced, only to be replaced by new networks of cyberinfrastructure that fulfill or surpass the roles played by the earlier epistemic infrastructure. Such displacement is neither possible nor desirable: the new capabilities brought by the internet are highly complementary to existing epistemic infrastructure and, in many cases, provide the opportunity to leverage that infrastructure toward payoffs never before imaginable. The main problem at present is that much of the world of extant epistemic infrastructure fails to grasp the potential of this new technological regime, while those at the forefront of the new technological regime naively dismiss the importance of the older regime of epistemic infrastructure. This talk will address these issues and lay out ideas for a more enlightened approach to public policy on this matter.


Panel 2: Emerging infrastructure for distributed learning and innovation

Dan Atkins, University of Michigan
Peter Freeman, National Science Foundation
Kurt Larsen, OECD

Emerging infrastructure for distributed learning and innovation

Dan Atkins, University of Michigan

This talk will review recent visions and activities around cyberinfrastructure and the empowerment of distributed but functionally complete knowledge communities. Leadership in this area is now coming primarily from science and engineering research communities (e-science), but cyberinfrastructure is also being explored in the context of the humanities, as well as for its implications for the future of education more broadly. The talk will relate cyberinfrastructure to the emergence of new forms of knowledge ecologies that hold promise for more synergistic and open approaches to learning, research, and meaningful engagement across disciplines, practices, cultures, and nations.


Cyberinfrastructure-in-the-making

Peter Freeman, National Science Foundation
Suzi Iacono, National Science Foundation

The opportunity currently exists to build upon the Nation's investments in high performance networking, supercomputing, virtual observatories and laboratories, middleware, and large-scale databases to integrate them into federated and interoperable science and engineering knowledge environments or cyberinfrastructure (CI). Harnessing the capabilities of these disparate and distributed resources will enable us to revolutionize science and engineering research and education and to engage in endeavors never before imaginable.

But can we get there from here? Unlike information systems built de novo or bought off the shelf, cyberinfrastructure requires scientists to learn from, and innovate on top of, what currently exists, while the current technologies are in use, and across distributed and diverse user and developer communities. Within some fields and research communities, such as environmental science, biology, and engineering, one can find prototype CI systems that are compatible, interoperable, internetworked, and embedded in the everyday work lives of scientists and engineers. But across the vast array of resources that are currently available and the disparate communities that might like to use them, the ideal of a completely integrated yet personally configurable, reliable, and sustainable CI seems distant.

In this paper, three critical CI challenges in moving forward will be conceptualized as socio-technical design challenges: 1) the evergreen integration of emerging CI with extant CI to produce a stable yet constantly refreshed environment; 2) the development of a shared CI that is also personally configurable or domain-specific; and 3) the challenges of change, integration, interoperability and standardization at an inter-organizational level of analysis. Examples will be derived from extant prototype CI projects to better understand how these challenges are currently being worked on (or not). Recommendations will be made for further research.


ICT in tertiary education: An assessment of the promises, realities and challenges of e-learning

Kurt Larsen, OECD
Stephan Vincent-Lancrin, OECD

The promises of e-learning for transforming tertiary education, and thereby advancing the knowledge economy, have rested on three arguments: e-learning could expand and widen access to tertiary education and training, improve the quality of education, and reduce its cost. The paper evaluates these three promises against the sparse existing data and evidence and concludes that the reality has not lived up to the promises so far.

Reflecting on ways to develop e-learning further, it then identifies, and proposes first answers to, four key questions:

  1. What innovation models are best suited to promoting e-learning investments? One challenge lies in building sustainable cost/benefit models for assessing e-learning investments in tertiary education.
  2. Open and proprietary e-learning initiatives are being developed in parallel: which business models can best help the development of e-learning, and when? What intellectual property rights issues are linked to these different initiatives?
  3. What key elements need to change in order to improve the access to, and relevance of, e-learning materials for students?
  4. What are the incentives and barriers for university management and faculty to engage in e-learning?


Panel 3: Knowledge and place: proximity, mobility, clusters, and institutions

Maryann Feldman, University of Toronto
Andrew Wyckoff, OECD
AnnaLee Saxenian, University of California, Berkeley
Jan Fagerberg, University of Oslo

The importance of proximity and location

Maryann Feldman, Rotman School of Management, University of Toronto

This chapter considers the importance of spatial proximity and geographic location for the creation and use of knowledge. The motivation is to provide an understanding of the forces that contribute to the agglomeration of innovative activity and subsequently affect the growth of firms, industries, and cities. The chapter reviews the body of research that examines the role of proximity and location in knowledge creation, and highlights the importance of the industry life cycle, the composition of activities within an agglomeration, and the effect of existing industrial structure.

Firms located in geographically bounded, knowledge-rich environments realize higher rates of innovation, increased entrepreneurial activity, and increased productivity due to the localized nature of knowledge creation and deployment. The ability of firms to derive economic value from knowledge depends on their capabilities and strategic use of resources; however, the local environment shapes a firm's competencies, its ability to absorb and utilize knowledge in the development of new products, and its access to resources. Thus, the capabilities of firms and regions weave a tapestry of knowledge creation and commercial success.


The changing dynamics of the global market for the highly-skilled

Andrew Wyckoff, OECD

During the 1990s, the US was able to sustain rapid growth in skill-intensive industries like software, IT and R&D without running into severe shortages of scientists and engineers that would have slowed the expansion. This period of growth has made the US the exemplar of the "knowledge-based economy" and the benchmark against which many other countries compare themselves. This success has come in spite of the fact that, for decades, experts have warned that US competitiveness is threatened by the poor performance of its schools and the weakness of its students in fundamentals like reading, math and science. A prime explanation for this paradox is that the US has been able to attract the highly skilled from abroad.

This paper analyses the global market for the highly skilled. It marshals the available data, albeit of mixed quality, to describe the state of this market at the turn of the 21st century, and then analyses the dynamics that are likely to alter it in the near term. A number of policy implications are outlined, especially regarding the location of research and the related infrastructure.


Districts and diasporas: Local and global knowledge in a networked economy

AnnaLee Saxenian, University of California, Berkeley

The distinction between local and global knowledge is diminishing as corporate hierarchies are replaced with more open and federated, or networked, organizations. Industrial districts like Silicon Valley that support local learning and innovation are increasingly linked to distant regions through informal diaspora networks. At the same time that these entrepreneurial ethnic communities rely on new information and communication technologies to access new sources of skill in their home countries, they also collaborate with domestic policymakers to develop shared understandings of the way to overcome local obstacles to development. The resulting institutions support local learning-by-doing and bootstrapping, on one hand, and reciprocal regional upgrading, on the other, as local producers collaborate with distant specialists to design and produce new products. Knowledge is jointly created and diffused as the partners in this ongoing experimentation process develop shared languages and understandings. The co-evolution of distant (and differently specialized) firms and districts helps account for the economic dynamism in emerging regions of India and China today. While this discussion focuses on long-distance partnerships, the analysis of the blurring of local (specialist, tacit) and global (system-level, explicit) knowledge is relevant at all geographic scales.


Knowledge in space: What hope for the poor parts of the globe?

Jan Fagerberg, Centre for Technology, Innovation and Culture, University of Oslo, Norway

This paper addresses an issue that has been highly contentious for years: the role of knowledge in catch-up and development. More than a century ago, Karl Marx pointed to the role of the richest countries as role models for the poor parts of the world. Ninety years ago, Thorstein Veblen presented an intriguing analysis of the facilitating role played by (modern forms of) knowledge in Germany's catch-up with the then world leader, the United Kingdom. Fifty years ago, the economic historian Alexander Gerschenkron returned to the topic with a somewhat less optimistic approach, emphasizing in particular the stringent requirements for the successful exploitation of knowledge, what in modern terminology is sometimes called "complementary factors" (or capabilities). Since then the controversy has lingered on with varying intensity. Recently the issue has been brought to the forefront again by the phenomenal growth (and catch-up) of China. This paper considers the various arguments that have been presented in the literature, and argues that the role of knowledge in development can only be properly assessed by basing the analysis on a thorough understanding of the role of knowledge in economic processes more generally. On this basis, the paper presents some suggestions for how catch-up processes, and in particular the role of knowledge, should be studied, and what the policy implications may be.


Panel 4: Measuring knowledge and its economic effects

Fred Gault, Statistics Canada
Reinhilde Veugelers, European Commission DG ECFIN
Jacques Mairesse, Institut National de la Statistique et des Études Économiques

Measuring knowledge and its economic effects: The role of official statistics

Fred Gault, Statistics Canada

Evidence-based policy requires timely and credible information. Statistical offices have provided such information for decades in support of fiscal, employment and industrial policies, but now, in the 21st century, the need is for new information on the knowledge products and processes that drive the economy.

This paper examines established indicators of knowledge creation (research and development), transmission (intellectual property (IP) commercialization, spin-off firms, and human resource mobility), and use (innovation and adoption of practices and technologies). It argues for moving beyond indicators of activities to the measurement of linkages between business, governments, and universities. Such indicators include measures of co-publication, commercialization of IP, foreign funding of R&D, and sources of ideas, practices and technologies for innovation. Measuring both activities and linkages is necessary for the understanding of a dynamic economic system. However, it is the measurement of the economic and social outcomes of the activities and linkages that is key to evidence-based policy making.

Indicators of knowledge creation, transmission and use are reviewed, leading to a discussion of measures of economic and social outcomes, and of gaps in both measurement and policy.


Assessing innovation capacity: fitting strategy and policy to the right framework

Reinhilde Veugelers, DG ECFIN / Katholieke Universiteit Leuven

Europe's growth performance has been the subject of increasing scrutiny over recent years, and it is a problem that Europe has addressed very aggressively. The much-debated analysis of the contribution of ICT production and use to overall productivity growth indicates the EU's difficulty in re-orienting its economy towards newer, higher-productivity growth sectors such as ICT, and raises the broader issue of whether the EU is insufficiently capable of creating and exploiting new technologies in general. This has motivated the vision of a supremely competitive knowledge-based economy under the "Lisbon strategy." The strategy involves a broad set of structural reforms to enhance employment and productivity, as well as a set of indicators that are continuously monitored to assess progress on these reforms.

Since the overall innovation capacity of the US seems to dominate that of the EU, the US experience is examined in order to explain the EU-US differences in productivity growth performance, particularly in high-tech manufacturing industries. In addition to much larger investments in R&D by both the public and the private sector (i.e., the basic innovation infrastructure), the US appears to earn a higher rate of return on knowledge investments through its capacity to produce new ideas, to commercialize a flow of innovative technologies over the longer term, and to translate this flow into economic growth. Analysis must go beyond research inputs to include the capacity to link public and private knowledge creators, and to link creators and users of knowledge. Sufficient 'demand pull' is needed to reward successful innovators, which requires sophisticated lead users willing to pay for innovations, effective intellectual property rights (IPR) schemes, a favorable macro-economic environment, well-functioning financial markets, vigorous competition in output markets, and flexible product and labour markets.

All this requires a broad systemic framework that goes well beyond R&D budgets but which unfortunately includes many factors difficult to document with statistical indicators. We assess the assumptions behind the Lisbon strategy, i.e. the choice of policy priorities and structural reforms for tackling the deficiencies in the innovative capacity. We then analyse the set of indicators chosen to evaluate progress. Are these the right indicators for informing improvement in innovative capacity? What are the interactions and complementarities between the various reforms and indicators? We also consider the need to monitor and evaluate the indicators – and whether this should be done individually or at a systemic level, at aggregate or sectoral levels, and over what time frame.


The evaluation of R&D contribution to growth

Jacques Mairesse, Institut National de la Statistique et des Études Économiques
Yusuf Kocoglu, Université de la Méditerranée

The paper will discuss the evaluation of the contribution of R&D to growth within the standard growth accounting framework. It will do so in parallel with (and in contrast to) the evaluation of the contribution of ICT to growth, by considering three illustrative variants, which differ mainly in their assessment of spillovers and quality adjustment. This measurement exercise is carried out for both the USA and France over the last twenty years, and it raises some major problems of evaluation and interpretation.
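
As a sketch of that framework (not of the authors' specific variants), the contribution of R&D is typically measured within a growth-accounting decomposition in which an R&D capital stock R is built by the perpetual-inventory method:

\[
\Delta \ln Y_t = \alpha\,\Delta \ln K_t + \beta\,\Delta \ln L_t + \gamma\,\Delta \ln R_t + \Delta \ln A_t,
\qquad
R_t = (1-\delta)\,R_{t-1} + RD_t ,
\]

where \(\gamma\,\Delta \ln R_t\) is the measured contribution of R&D to output growth, \(\gamma\) is the output elasticity of R&D capital, and \(\delta\) its depreciation rate. Variants of the exercise differ in whether \(\gamma\) captures only private returns or also spillovers, and in how deflators adjust for quality.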


Panel 5: Learning by doing, interacting, and experimenting: users as innovators

Bengt-Åke Lundvall, University of Aalborg
Eric von Hippel, Massachusetts Institute of Technology
Stefan Thomke, Harvard Business School

Interactive learning, social capital and economic performance

Bengt-Åke Lundvall, University of Aalborg

There is a tendency among scholars working on innovation and knowledge not to confront macro-economic issues. Seen from the other side, with few exceptions, it has been quite acceptable among macroeconomists to assume that what happens at the macroeconomic level can be well understood without bothering too much about institutions related to innovation.

In this essay I will present some arguments for why this might not be correct. Specifically, I will argue that the institutional preconditions for establishing interactive learning that interconnects users and producers in processes aiming at new products (social capital) have a major impact on the performance of the economy as a whole.

Learning by interacting is fundamental since it transforms the outcomes of learning by doing and learning by using from being local to being non-local. Embodying knowledge in new services and products may be seen as an alternative to codification as a mechanism for generalizing local knowledge.

The argument is predominantly conceptual and will be built up through references to contributions from a handful of outstanding economists (among them Adam Smith, Kenneth Arrow, Nathan Rosenberg, Luigi Pasinetti, Oliver Williamson, and Douglass North). The paper follows up on ideas first developed in the booklet 'Product Innovation and User-Producer Interaction' (Lundvall 1985).


Democratizing innovation

Eric von Hippel, Massachusetts Institute of Technology

Innovation is rapidly becoming democratized. Users, aided by improvements in computer and communications technology, increasingly can develop their own new products and services. User innovation, the data show, is strongly concentrated among lead users. These lead users--both individuals and firms--often freely share their innovations with others, creating user-innovation communities and a rich intellectual commons. The trend toward democratized innovation is visible both in information products like software and also in physical products. Lead user innovation provides a valuable feedstock for manufacturer innovation, and produces an increase in social welfare relative to a manufacturer-only innovation system.

Freely-revealed innovations by users form the basis for a user-centric innovation system that is so robust that it is actually driving manufacturers out of product design in some fields. I will suggest ways that manufacturers can redesign their innovation processes to adapt to newly-emerging user-centric innovation systems. Changes should also be made to governmental legislation and policies, such as the Digital Millennium Copyright Act, that inflict collateral damage on user innovation. The emergence of democratized innovation systems will be disruptive to some, but the end result appears to be well worth striving for!


Innovation and experimentation

Stefan Thomke, Harvard Business School

Experimentation matters because it fuels the discovery and creation of knowledge and thereby leads to the development and improvement of products, processes, systems, and organizations. Everything we use today arrived through a process of experimentation over time; improved tools, new processes, and alternative technologies have all arisen because they were worked out in various structured ways.

But experimentation has often been expensive in terms of the time involved and the labor expended, even as it has been essential to innovation. What has changed, particularly given the new technologies available, is that it is now possible to perform more experiments in an economically viable way while accelerating the drive toward innovation. Not only can more experiments be run today; the kinds of experiments possible are also expanding. Never before has it been so economically feasible to ask "what-if" questions and generate preliminary answers. These new technologies have also made possible new models for involving users in product development.

Specifically, by putting experimentation technologies into the hands of customers (in the form of "toolkits"), managers can tap into possibly the largest source of dormant experimentation capacity. Not only can shifting experimentation to customers result in faster development of products better suited to their needs, but customer experimentation can also yield innovations that would be uneconomical or too "complex" for companies to design. Eric von Hippel and I have examined some of these new innovation models and, in addition to speaking about experimentation in general, I plan to report on some of our findings.


Panel 6: Channeling knowledge: the changing role of research institutions

Robin Cowan, University of Maastricht
Paul A. David, Stanford University / Oxford University
David Mowery, University of California / Harvard Business School / NBER

On the future role of universities and research institutes in the knowledge economy

Robin Cowan, Universiteit Maastricht

In recent decades our understanding of innovation has moved from a linear model to a systemic model, which sees an innovation system as a complex network containing many and varied links between different types of knowledge agents. Universities and research institutions are now described as (key) nodes in these systems. For specialized research institutions this is a natural role. For universities it is not; it represents a dramatic shift in their place in society. At the same time, and perhaps as a consequence of attempts to position universities as these central nodes, universities are being forced to adopt new ideas regarding success, excellence, and funding. This implies a fundamental change in the way universities see themselves. Meanwhile, we have seen a growing commodification of knowledge within industry, as more and more complex operations are outsourced. As attention focuses on "technology transfer" as a central mechanism of knowledge flow from universities to industry, this commodification is being pushed into the university sphere as well.

In principle, this commodification removes knowledge from the context of learning, and often even of research. Historically, universities have been explicitly organized to hold these three things (research, learning, and knowledge) together, and have thereby fulfilled both a social and an economic role. The focus on universities as major nodes in an innovation system demands not minor changes in university structure and culture but a major overhaul both of how universities function and of their role in society. Until some major re-organization has successfully taken place, tensions among the different university roles (cultural, scientific, and economic) will prevent the institution from fulfilling any of them effectively.


Building a cyberinfrastructure for enhanced scientific collaboration: its 'soft' parts may prove to be the toughest task

Paul A. David, Stanford University / Oxford Internet Institute

A new generation of information and communication infrastructures, including advanced Internet computing and Grid technologies, promises to enable more direct and shared access to more widely distributed computing resources than was previously possible. Scientific and technological collaboration, consequently, is increasingly seen as critically dependent upon effective access to, and sharing of, digital research data and the information tools that facilitate data being structured for efficient storage, search, retrieval, display, and higher-level analysis. A recent (February 2003) report to the U.S. NSF Directorate for Computer and Information Science and Engineering urged that funding be provided for a major enhancement of computer and network technologies, thereby creating a cyberinfrastructure whose facilities would support and transform the conduct of scientific and engineering research. The articulation of this programmatic vision reflects a widely shared expectation that solving the technical engineering problems associated with the advanced hardware and software systems of the cyberinfrastructure will yield revolutionary payoffs by empowering individual researchers and increasing the scale, scope and flexibility of collective research enterprises.

The argument of this paper, however, is that engineering breakthroughs alone will not be enough to achieve such an outcome; success in realizing the cyberinfrastructure's potential, if it is achieved, will more likely be the result of a nexus of interrelated social, legal and technical transformations. The socio-institutional elements of a new infrastructure supporting collaboration – that is to say, its supposedly "softer" (non-engineering) parts – are every bit as complicated as the hardware and computer software and, indeed, may prove much harder to devise and implement. The roots of this latter class of challenges facing "e-Science" lie in the micro- and meso-level incentive structures created by the existing legal and administrative regimes. Although a number of these same conditions and circumstances appear to be equally significant obstacles to commercial provision of Grid services in inter-organizational contexts, the domain of publicly supported scientific collaboration will provide a more hospitable environment in which to experiment with a variety of new approaches to solving these problems. The paper concludes by proposing several "solution modalities," including some that could also be made applicable to fields of information-intensive collaboration in business and finance that must regularly transcend organizational boundaries.


The Bayh-Dole Act of 1980 and university-industry technology transfer: A policy model for other governments?

David C. Mowery, Haas School of Business, UC Berkeley / Harvard Business School / NBER
Bhaven Sampat, Georgia Institute of Technology / University of Michigan

Cross-border "emulation" of economic and technology policies among industrial-economy governments has become common in recent years, and spans policies ranging from intellectual property rights to collaborative R&D. Recent initiatives by a number of industrial-economy governments suggest considerable interest in emulating the Bayh-Dole Act of 1980, a piece of legislation that is widely credited with stimulating significant growth in university-industry technology transfer and research collaboration in the United States. I examine the effects of Bayh-Dole on university-industry collaboration and technology transfer in the United States, emphasizing the lengthy history of both activities prior to 1980 and noting the extent to which these activities are rooted in the incentives created by the unusual scale and structure (by comparison with Western Europe or Japan) of the U.S. higher education system. Efforts at "emulation" of the Bayh-Dole policy by other governments are likely to have modest success at best without greater attention to the underlying structural differences among the higher education systems of these nations.


Panel 7: Mixed models of cooperation and control

Iain Cockburn, Boston University / NBER
Josh Lerner, Harvard Business School
Arti Rai, Duke University
Brian Fitzgerald, University of Limerick

Blurred boundaries: Tensions between open scientific resources and commercial exploitation of knowledge in biomedical research

Iain M. Cockburn, Boston University / NBER

Over the past 30 years, biomedical research has seen a dramatic extension of exclusion-based intellectual property into the domain of basic science. Pharmaceutical and biotechnology companies have become important participants in basic biomedical research, obtaining large numbers of patents on fundamental scientific knowledge. In parallel, universities and other non-profit entities have become important – and enthusiastic – participants in the patent system. Though conducted at a significant distance from actual sales of drugs to patients, intense market-based competition based on proprietary rights over biomedical knowledge now plays a very significant role in the overall rate and direction of innovation in this area. At the same time, the rules and norms governing the production of knowledge in open science have diffused into commercial research, with many commercial entities managing internal and external transactions in knowledge in ways that closely resemble academic research, emphasizing collaboration, interaction, peer review and publication.

The implications of this blurring of boundaries, with its concomitant realignment of interests, new actors, and new relationships, for the productivity of the global biomedical research enterprise are unclear. Increased competition, specialization, and market incentives may result in faster and more effective realization of the gains from the revolutionary advances in biomedical science. At the same time, the proliferation of patents and proprietary databases and the substitution of commercial goals for "pure" scientific inquiry may choke innovation. The new discipline of bioinformatics, which combines molecular biology and computer science to extract useful knowledge from vast amounts of data on DNA sequences, molecular structures, and the distribution of disease incidence and genetic variation in large populations, has become a vital aspect of biomedical research. But this is also an area where the patent system, proprietary rights, and open governance of knowledge and scientific information resources come into sharp conflict, illustrating some difficult dilemmas for policy makers. The interdisciplinary and boundary-spanning nature of bioinformatics brings together two unresolved areas of controversy in intellectual property policy – patents on algorithmic inventions and genetic information – highlighting challenges for legal and administrative processes, and rendering this important new discipline particularly sensitive to choices and tradeoffs in the governance of knowledge.


Understanding the open source movement

Josh Lerner, Harvard Business School

This paper reviews our understanding of the growing open source movement. We highlight how many aspects of open source software appear initially puzzling to an economist. As we acknowledge, our ability to answer many of the questions raised here confidently is likely to increase as the open source movement itself grows and evolves. At the same time, it is heartening how much of open source activity can be understood within existing economic frameworks, despite claims to the contrary. The labor and industrial organization literatures provide lenses through which the structure of open source projects, the role of contributors, and the movement's ongoing evolution can be viewed.


"Open and collaborative" biomedical research: A new model for biomedicine

Arti Rai, Duke University

The advent of open source software has prompted some theoretical speculation about the applicability of open source innovation principles to biomedical research. This paper moves beyond theoretical speculation into an empirical examination of biomedical research projects that operate under what might be called an "open and collaborative" model. Open and collaborative projects represent a fresh approach in that they both disavow biomedicine's exclusionary behavior and reject its small-lab-based structure. The paper argues that such projects divide into three analytically distinct categories: software, databases, and "wet lab" biology. In the case of software and databases, open collaboration may dominate market-based innovation to the extent that such collaboration reduces associated transaction and access costs. In the case of software and data, however, the innovation task is modular. Thus, although innovation through markets may entail somewhat higher transaction costs than open collaboration, market-based innovation is still feasible. Hence the model's least intuitive, but most exciting, application may involve the non-modular area of "wet lab" systems biology: in this context, the model may allow a more coordinated and comprehensive attack than has heretofore been possible on the sorts of problems that cause promising drug candidates, particularly for complex diseases, to fail.

Open and collaborative biomedicine diverges in significant ways from non-biomedical open source innovation. Particularly outside the area of software, open and collaborative biomedicine may require restrictions on participation; significant centralization and standardization; reliance on public funding; and limitations on use of "copyleft" licensing. Additionally, if the model is to gain significant traction, institutional problems involving the division of consulting revenues between scientists and universities as well as inefficient biological science publication norms will have to be addressed.


The evolution of open source software

Brian Fitzgerald, University of Limerick

Routine development of software within organizations all over the world remains highly problematic. The term 'software crisis' was coined long ago to refer to a tripartite set of software development problems: the majority of software projects exceed their budget, fail to conform to their development schedule, and do not work as expected when eventually delivered. In recent times, the open source phenomenon has attracted considerable attention as a seemingly agile, practice-led initiative that appears to address each of the three aspects of the software crisis: cost, time-scale and quality. Open source products are freely available for public download, so the cost issue is immediately addressed. From the point of view of development speed, the collaborative, parallel efforts of globally-distributed co-developers have allowed many open source products to be developed much more quickly than conventional software. In terms of quality, many open source products are recognized for their high standards of reliability, efficiency and robustness, and the open source phenomenon has produced several 'category killers' (i.e., products that remove any incentive to develop competing products) in their respective areas – GNU/Linux, Apache, and BIND all spring to mind. The open source model also harnesses the scarcest resource of all: talented software developers. The resulting peer review model, comprising extremely talented individuals, serves to ensure the quality of the software produced. The open source concept itself is founded on the paradoxical premise that software source code—the 'crown jewels' for many proprietary software companies—should be provided freely to anyone who wishes to see it or modify it.

However, the OSS phenomenon is fraught with tensions and paradoxes, both in its early form and as it has emerged to become a more commercial, hybrid phenomenon. This presentation will consider some of these tensions and paradoxes in the initial emergence of OSS.

We will then consider the recent emergence of the more hybrid form of OSS, which we term OSS 2.0, and a number of tensions and paradoxes which threaten its stability.

Following this, we will look at the various business models that underpin OSS 2.0. Finally, we will consider some of the impacts of OSS 2.0 in terms of software development, management and in the wider sense.


Panel 8: Architecting knowledge: platforms, modularity, and coordination

Carliss Baldwin, Harvard Business School
W.E. Steinmueller, University of Sussex

Design architecture in a knowledge-based economy

Carliss Y. Baldwin, Harvard Business School
Kim B. Clark, Harvard Business School

Knowledge gets converted into useful things via designs. However, a "pure" design is, strictly speaking, only an idea. Unless the design is reified—made real, brought into reality—it cannot affect the physical world and cannot be used or consumed. Thus, in order to affect the world and be valued, a design idea must first be completed and then made into something. Those actions in turn require human effort and human organization.

A design architecture is a partitioning of design parameters that both organizes and constrains the search for new designs. Modular design architectures partition design spaces in ways that facilitate the evolution of the system of modules. Such partitioning is especially important when the system is "option-rich," that is, when the modules have high technical potential and hence the capacity to evolve into new, improved versions. In the field of information technology, examples of option-rich modular design architectures—ORMDAs for short—include IBM's System/360, the initial IBM personal computer architecture, VLSI chip architectures, the Internet, and Unix/Linux. As these examples indicate, ORMDAs are an important source of innovation, economic value and consumer welfare in knowledge-based economies.
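
The options logic behind "option-rich" can be made concrete with a stylized sketch drawn from Baldwin and Clark's Design Rules (an illustration, not a formula from this abstract). If k independent design experiments are run on a module whose technical outcomes have standard deviation \(\sigma\), and only the best outcome is kept, the expected net value is

\[
V(k) = \sigma\, Q(k) - k\,c ,
\qquad
Q(k) = \mathbb{E}\big[\max(z_1,\dots,z_k)\big], \quad z_i \sim \mathcal{N}(0,1),
\]

where \(c\) is the cost per experiment. Because \(Q(k)\) grows only slowly (roughly like \(\sqrt{2\ln k}\)), additional experiments pay off most for modules with high technical potential (large \(\sigma\)), which is precisely what makes an architecture option-rich.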

ORMDAs interact with the economic system and with public policy in several ways. In particular, because they are "only" designs, ORMDAs "need" economic institutions and organizations so that the designs can be created, completed, realized, transferred, and paid for.

(Of course, designs do not really need or want anything. Rather, people need or want solutions to their problems and those solutions in turn are based on designs being created, completed, realized, transferred, and paid for.)

In this paper, drawing examples from the greater computer industry, we will describe how ORMDAs interact with the institutional structures of a modern economy. The crux of our argument is that there are basically two institutional forms that work "on top of" an ORMDA: clusters and communities.

These institutional forms are the ones best suited to meeting an ORMDA's challenges. In modern, value-driven economies, they are largely self-organizing. However, laws and public policy can affect the rate of design evolution within an ORMDA, the turbulence of competition, and the efficiency of design search processes within particular technological domains.


Technical compatibility standards and the co-ordination of the industrial and international division of labour

Ed Steinmueller, SPRU, University of Sussex

Technical compatibility standards for inter-operability and inter-connection facilitate the inter-organisational and international division of labour. These standards are a prominent feature of the information and communication technology (ICT) industries, where modularity of product design has contributed to rapid improvement in performance and dramatic declines in price. In many other industries, however, modularity has proven to be an elusive goal for both technical and economic reasons. Analysing modularity as a 'design discipline' highlights the difficulty of managing the complex body of knowledge required to construct well-delimited boundaries between sub-systems, and the limitations of modelling and simulating the operation of virtual prototypes in the design process. Analysis of the advance of boundary-setting, which involves institutional and economic issues, and of the modelling and simulation of virtual prototypes, which involves technical and economic issues, offers insights about constraints on the advance of modularity. This kind of analysis also indicates the potential role of public research in providing reference data and improved modelling and simulation methods for commercial adaptation and innovation. Examples from the mechanical, building, and software industries will be used to illustrate this argument.

Technical compatibility standards also involve important issues of industrial structure and power – they are a means for 'downstream' system integrators to enhance their potential profitability by strengthening competition among component and sub-system producers. An important feature of this process, the incentives governing co-operation between upstream and downstream firms, was analysed by Teece using a framework comparing generic and co-specialised assets. This paper extends the Teece framework to the case of knowledge assets that support specific implementations of component and sub-system designs. Knowledge assets present important appropriation problems, for which intellectual property protection of implementations is the assumed solution. Intellectual property provides a means of limiting the monopsonistic market power of downstream system integrators and distributing economic rents upstream to component and sub-system producers. This countervailing power, however, comes at the expense of a proliferation of technical compatibility standards that may constrain the progress of modularity and lower social welfare. Opportunities for, and constraints on, public intervention in these processes are considered. The analysis is illustrated with examples from the personal computer and telecommunication industries.


Panel 9: Fences and thickets: benefits and costs of private controls

Dominique Guellec, European Patent Office
Adam Jaffe, Brandeis University
Dietmar Harhoff, University of Munich

The patent system and knowledge policies

Dominique Guellec, European Patent Office

The environment in which the patent system operates has experienced radical change over the recent past: the increased economic importance of knowledge, new ways of producing and diffusing knowledge, and the increasing involvement of IP in various types of market strategies and market transactions. In order to continue to fulfill its traditional mission - encouraging innovation and the diffusion of knowledge through market mechanisms - the patent system has to evolve. As part of its support for the Lisbon agenda - a series of policies agreed by EU governments in order to enhance economic growth in Europe - the European Patent Office is engaged in reflecting on these issues. This presentation will explore, based on the European experience, two particular challenges that the new environment raises for patent systems. The first challenge is selectivity: it is the role of patent offices - a major component of patent systems - to reject applications which do not meet certain criteria. Effective tools are available to patent offices for operating this selection, and the toolbox should keep being strengthened as the number of patent applications increases: for example, a post-grant opposition system, the involvement of third parties in the examination process, and the development of industry-specific skills. The second challenge is the closer integration of patent systems into the broader framework of innovation policies. Ensuring compatibility with competition policy, encouraging licensing, enhancing the diffusion and implementation of inventions from the public sector, promoting the diffusion of knowledge, be it proprietary or in the public domain, and monitoring the use of patents as financial and fiscal levers: these are new missions to which patent systems should contribute.


Patent Systems for the 21st Century Knowledge Economy

Adam B. Jaffe, Fred C. Hecht Professor in Economics and Dean of Arts and Sciences, Brandeis University

The institution of patents is in trouble. Increasingly, the cost and uncertainty associated with patent litigation are perceived as creating a hindrance to innovation and progress that more than offsets the benefits associated with the patent incentive. Debates rage as to whether patent protection is desirable or appropriate for certain technologies that are likely to be key drivers of innovation this century, including biotech and software.

The key to re-establishing government policy most conducive to innovation is not to abolish patents, nor to restrict their applicability to certain technologies but not others. It is to reform the process by which patents are examined and ultimately granted. The existing system—particularly in the U.S. but to some extent worldwide—is not structured in recognition of the pattern of knowledge location and flow in the 21st century. The system relies primarily or exclusively upon patent examiners to seek out and find any existing information relevant to the determination of the novelty and obviousness of a purported invention. This approach worked well enough when most such information resided in patent documents, but it is woefully inadequate today. The system has to be changed to create the opportunity and incentive for outside parties who hold information relevant to pending patent applications to bring that information forward, and to ensure that it is used effectively in the patent review process.

To bring about this change without creating logjams that could bring the system to a halt requires a staged approach, in which most patents get only cursory review without outside contributions, some are subject to pre-grant opposition, and some are reviewed by the Patent Office post-grant in an effective re-examination process. Appropriate incentives are necessary so that these increasingly stringent forms of review are invoked as needed, but only as needed. With these reforms, we would restore a solid foundation to the "presumption of validity" enjoyed by granted patents, and obviate the need to carve vast swaths of modern technology out of the realm of patentable inventions.

This paper is based in large part on my book, co-authored with Josh Lerner of Harvard Business School, Innovation and Its Discontents: How Our Broken Patent System Is Endangering Innovation and Progress, and What to Do About It.


Demand for Patents and the Evolution of Patent "Quality"

Dietmar Harhoff, University of Munich

This presentation surveys and assesses recent explanations of the global surge in patent applications. The most pertinent policy implications arise from theories that postulate a demise of "patent quality" as a corollary of the rise in patent application figures. The notion of "patent quality" is discussed in detail, and the empirical evidence in favor of the quality-quantity tradeoff is reviewed critically. The presentation concludes that this hypothesis has considerable empirical support. Moreover, the political economy of patent offices appears to be geared towards quantity rather than quality as the guiding principle. The study finally presents the implications of this development for the diffusion of knowledge and for the context of invention and innovation in a pro-patent-quantity environment.
