CCR Papers from October 2011

  • S. Keshav

    This editorial was motivated by a panel on the relationship between academia and industry, moderated by Bruce Davie, at the SIGCOMM 2011 conference. I can claim some familiarity with the topic, having spent roughly ten years each in academia and industry during the last twenty years.

    My thesis is that although industry can make incremental gains, real technical breakthroughs can only come from academia. However, to have any impact, these academic breakthroughs must be motivated, at some level, by a real-world problem, and the proposed solutions should be feasible, even if implausible. Therefore, it is in the self-interest of industry to fund risky, long-term, curiosity-driven academic research rather than sure-shot, short-term, practical research with well-defined objectives. Symmetrically, it is in the self-interest of academic researchers to tackle real-world problems motivated by the problems faced in industry and to propose reasonable solutions.

    There are many underlying reasons why technological revolutions today can only come from academia. Perhaps the primary reason is that, unlike most industrial research labs of today (and I am purposely excluding the late, great Bell Labs of yore), academia still supports long-term, curiosity-driven research. This is both risky and an inherently ‘wasteful’ use of time. Yet this apparently wasteful work is the basis for many of today’s technologies, ranging from Google search to the World Wide Web to BSD Unix and Linux. On closer thought, this is not too surprising. Short-term, practical research requires the investigator to have well-defined goals. But revolutionary ideas cannot be reduced to bullet items on PowerPoint slides: they usually arise as unexpected outcomes of curiosity-driven research. Moreover, it takes time for ideas to mature and for the inevitable missteps to be detected and corrected. Industrial funding cycles of six months to a year are simply not set up to fund ideas whose maturation can take five or even ten years. In contrast, academic research, built on the foundation of tenure and unencumbered by the demands of the marketplace, is the ideal locus for long-term work.

    Long-term, curiosity-driven research alone does not lead to revolutions. It must go hand-in-hand with an atmosphere of intellectual openness and rigour. Ideas must be freely exchanged and freely shot down. The latest work should be widely disseminated and incorporated into one’s thinking. This openness is antithetical to the dogma of ‘Intellectual Property’ by which most corporations are bound. Academia, thankfully, has mostly escaped from this intellectual prison. Moreover, industry is essentially incompatible with intellectual rigour: corporate researchers, by and large, cannot honestly comment on the quality of their own company’s products and services.

    A third ingredient in the revolutionary mix is the need for intense thinking by a dedicated group of researchers. Hands-on academic research tends to be carried out by young graduate students (under the supervision of their advisors) who are unburdened either by responsibilities or by the knowledge that something just cannot be done. Given training and guidance, given challenging goals, and given a soul-searing passion to make a difference in the world, a mere handful of researchers can do what corporate legions cannot.

    These three foundations (curiosity-driven research, intellectual openness, and intense thinking) set academic research apart from the industrial research labs of today, and they are the reason why the next technological revolution is likely to come from academia, not industry.

    In the foregoing, I admit that I have painted a rather rosy picture of academic research. It is important to recognize, however, that the same conditions that lead to breakthrough research are also susceptible to abuse. The freedom to pursue long-term ideas unconstrained by the marketplace can also lead to work that is shoddy and intellectually dishonest. For instance, I believe that it may be intellectually honest for a researcher to make assumptions that do not match current technology, but it is intellectually dishonest to make assumptions that violate the laws of physics. In a past editorial, I have written in more depth about these assumptions, so I will not belabour the point. I will merely remark here that it is incumbent on academic researchers not to abuse their freedom.

    A second inherent problem with academic research, especially in the field of computer networking, is that it is difficult, perhaps impossible, to do large-scale data-driven research. As a stark example, curiosity-driven work on ISP topology is impossible if ISPs sequester this data. Similarly, studying large-scale data centre topology is challenging when the largest data centre one can build in academia has only a few hundred servers.

    Finally, academic research tends to be self-driven and sometimes far removed from real-world problems. These real-world problems, which are faced daily by industrial researchers, can be intellectually demanding and their solution can be highly impactful. Academic researchers would benefit from dialogue with industrial researchers in posing and solving such problems.

    Given this context, the relationship between academia and industry becomes relatively clear. What academia has and industry needs is committed, focussed researchers and the potential for long-term, revolutionary work. What industry has and academia needs is exposure to real-world problems, large-scale data and systems, and funding. Therefore, it would be mutually beneficial for each party to contribute to the other. Here are a few specific suggestions for how to do so.

    First, industry should fund academic research without demanding concrete deliverables and unnecessary constraints. Of course, the research (and, in particular, the research assumptions) should be adequately monitored. But the overall expectation should be that academic work would be curiosity-driven, open, and long-term.

    Second, industry should try to expose academic researchers to fundamental real-world problems and put at their disposal the data that is needed for their solution. If necessary, academic researchers should be given access to large-scale systems to try out their solutions. This can be done without loss of intellectual property by having students and PIs visit industrial research labs as interns or during sabbaticals. It could also be done by having industrial researchers spend several weeks or months as visitors to university research labs.

    Third, industry should devote resources not only to funding, but also to internal efforts that match the output of academic research (papers and prototypes) to its own needs (products and systems).

    Fourth, academic researchers should choose research problems based not just on what is publishable, but also on the potential for real-world impact. This would naturally turn them to problems faced by industry.

    Fifth, academic researchers should ensure that their solutions are feasible, even if implausible. For instance, a wireless system for cognitive radio built on USRP boards is implausible but feasible. In contrast, a wireless system that assumes that all radio coverage areas are perfectly circular is neither plausible nor feasible. This distinction should be emphasized in the academic review of technical papers.

    Finally, academic researchers should recognize the constraints under which industry operates and, to the extent possible, accommodate them. For instance, they should encourage students to take on internships, fight the inevitable battles with the university office of research to negotiate IP terms, and understand that their points of contact will change periodically due to the nature of corporate (re-)organizations.

    The SIG can also help this interaction. Industry-academic fora such as the panel at SIGCOMM, industry-specific workshops, and industry desks at conferences allow academic researchers to interact with representatives from industry. SIGCOMM could also offer tutorials focussed on topics of current interest to industry. These efforts would certainly make deep collaboration between academia and industry more likely.

    I hope that these steps will move our community towards a future where academic research, though curiosity-driven, continues to drive real-world change because of its symbiotic relationship with industrial partners.

    This editorial benefited from comments from Bruce Davie and Gail Chopiak. 

  • Giuseppe Bianchi, Nico d'Heureuse, and Saverio Niccolini

    Several traffic monitoring applications may benefit from the availability of efficient mechanisms for approximately tracking smoothed time averages rather than raw counts. This paper provides two contributions in this direction. First, our analysis of Time-decaying Bloom filters, formerly proposed data structures devised to perform approximate Exponentially Weighted Moving Averages on streaming data, reveals two major shortcomings: biased estimation when measurements are read at arbitrary time instants, and slow operation resulting from the need to periodically update all the filter's counters at once. We thus propose a new construction, called On-demand Time-decaying Bloom filter, which relies on continuous-time operation to overcome the accuracy/performance limitations of the original window-based approach. Second, we show how this new technique can be exploited in the design of high-performance stream-based monitoring applications, by developing VoIPSTREAM, a proof-of-concept real-time analysis version of a formerly proposed system for telemarketing call detection. Our validation results, carried out over real telephony data, show how VoIPSTREAM closely mimics the feature extraction process and traffic analysis techniques implemented in the offline system, at a significantly higher processing speed, and without requiring any storage of per-user call detail records. (A minimal illustrative sketch of the on-demand decay idea appears at the end of this entry.)

    Augustin Chaintreau
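
    The central idea in the abstract above, replacing periodic bulk updates of every counter with decay applied only when a cell is touched, can be illustrated with a small sketch. The following Python fragment is not the construction from the paper; it is a minimal illustration under assumptions chosen here for exposition: a count-min-style array of counters, a decay time constant tau, and cell indices derived from a SHA-256 digest (the class name DecayingSketch and all parameter names are hypothetical). Each cell records when it was last decayed and is rescaled by exp(-dt/tau) on demand at every read or write, so no periodic pass over the whole filter is needed and a value read at an arbitrary instant always reflects the full elapsed time.

    import hashlib
    import math
    import time

    class DecayingSketch:
        """Illustrative only: hash-indexed counters with lazy ('on-demand')
        exponential decay, avoiding periodic rescaling of all cells at once."""

        def __init__(self, width=1024, num_hashes=4, tau=60.0):
            self.width = width            # number of counter cells
            self.num_hashes = num_hashes  # cells touched per key (<= 8 here)
            self.tau = tau                # decay time constant, in seconds
            self.values = [0.0] * width   # decayed counter values
            self.stamps = [0.0] * width   # time each cell was last decayed

        def _cells(self, key):
            # Derive num_hashes cell indices from a single SHA-256 digest.
            digest = hashlib.sha256(key.encode()).digest()
            for i in range(self.num_hashes):
                yield int.from_bytes(digest[4 * i:4 * i + 4], "big") % self.width

        def _decay(self, idx, now):
            # Bring one cell up to date: multiply its value by exp(-dt / tau).
            dt = now - self.stamps[idx]
            if dt > 0:
                self.values[idx] *= math.exp(-dt / self.tau)
                self.stamps[idx] = now

        def add(self, key, amount=1.0, now=None):
            # Decay the touched cells, then credit the new observation.
            now = time.time() if now is None else now
            for idx in self._cells(key):
                self._decay(idx, now)
                self.values[idx] += amount

        def estimate(self, key, now=None):
            # Conservative (minimum) estimate of the key's decayed count.
            now = time.time() if now is None else now
            estimates = []
            for idx in self._cells(key):
                self._decay(idx, now)
                estimates.append(self.values[idx])
            return min(estimates)

    For example, calling add("alice->bob") on every new call event and estimate("alice->bob") at query time yields a smoothed, automatically fading per-key score over a timescale of roughly tau seconds; the actual VoIPSTREAM feature extraction described in the paper is, of course, considerably more elaborate.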
  • Tom Callahan, Mark Allman, Michael Rabinovich, and Owen Bell

    The Internet has changed dramatically in recent years. In particular, fundamental changes have occurred in who generates most of the content, in the variety of applications used, and in the diverse ways ordinary users connect to the Internet. These factors have led to an explosion in the amount of user-specific meta-information that is required to access Internet content (e.g., email addresses, URLs, social graphs). In this paper we describe a foundational service for storing and sharing user-specific meta-information and show how this new abstraction could be utilized in current and future applications.

    Stefan Saroiu
  • Craig Partridge

    About ten years ago, Bob Lucky asked me for a list of open research questions in networking. I didn't have a ready list and remarked that it would be good to have one. This essay is my (long-belated) reply.

  • Soumya Sen, Roch Guerin, and Kartik Hosanagar

    Should a new "platform" target a functionality-rich but complex and expensive design, or instead opt for a bare-bones but cheaper one? This is a fundamental question with profound implications for the eventual success of any platform. A general answer is, however, elusive, as it involves a complex trade-off between benefits and costs. The intent of this paper is to introduce an approach based on standard tools from the field of economics, which can offer some insight into this difficult question. We demonstrate its applicability by developing and solving a generic model that incorporates key interactions between platform stakeholders. The solution confirms that the "optimal" number of features a platform should offer strongly depends on variations in cost factors. More interestingly, it reveals a high sensitivity to small relative changes in those costs. The paper's contribution and motivation are in establishing the potential of such a cross-disciplinary approach for providing qualitative and quantitative insights into the complex question of platform design.

  • kc claffy

    In June 2011 I participated in a panel on network neutrality hosted at the June cybersecurity meeting of the DHS/SRI Infosec Technology Transition Council (ITTC), where "experts and leaders from the government, private, financial, IT, venture capitalist, and academia and science sectors came together to address the problem of identity theft and related criminal activity on the Internet." I recently wrote up some of my thoughts on that panel, including what network neutrality has to do with cybersecurity.

  • kc claffy

    I recently published this essay on CircleID, giving my thoughts on ICANN's recent decision to launch .XXX and the larger new gTLD program this year. Among other observations, I describe how .XXX marks a historical inflection point, where ICANN's board formally abandoned any responsibility to present an understanding of the ramifications of probable negative externalities ("harms") in setting its policies. That ICANN chose to relinquish this responsibility puts the U.S. government in the awkward position of trying to tighten the few inadequate controls that remain over ICANN, and leaves individual and responsible corporate citizens in the unenviable yet familiar position of bracing for the consequences.
