Layered Model
From its origin, ARPANET / Internet followed a layered model. [DARPA 1981 II-19 ("A "layered" approach was taken such that more complex procedures (such as File Transfer Procedures) were built on top of simpler procedures in the host Network Control Program.")]
[DARPA 1981 III-58 ("The NWG adopted a "layered" approach to the specification of communications protocols, wherein the higher layers of protocol use the services of lower layers; the advantages and disadvantages of the layered approach are discussed elsewhere in this report. As shown in Figure 3, the lowest layer is the IMP-to-host protocol. The next layer (called the host-to-host layer in the figure) specifies methods of establishing communications paths between hosts, managing buffer space at each end of a communications path, etc. Next, the Initial Connection Protocol or ICP specifies a standard way for a remote user (or process) to attract the attention of a network host, preparatory to using that host: The ICP provides the analog of the user pressing the attention button at a local terminal on a host. In the next layer is the Telecommunications Network or TELNET protocol, which was designed to support terminal access to remote hosts. TELNET is a specification for a network standard terminal and the protocol for communicating between this standard terminal and a host. The next logical protocol layer consists of function oriented protocols, two of which, File Transfer Protocol (FTP) and Remote Job Entry protocol (RJE), are shown in the figure. Finally, at any point in the layering process, it is possible to superimpose ad hoc protocols.")]
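The layering the report describes can be sketched in a few lines of code. This is purely illustrative (hypothetical function names, not actual NCP, TELNET, or FTP code; the ICP step is omitted for brevity): each higher layer does its own job and relies on the services of the layer beneath it.

```python
# Illustrative sketch of the ARPANET layering described above.
# Each layer wraps its data and hands it to the layer below.

def imp_to_host(bits: str) -> str:
    """Lowest layer: carry data between an IMP and a host."""
    return f"IMP[{bits}]"

def host_to_host(data: str) -> str:
    """Establish a path between hosts; uses the IMP-to-host service."""
    return imp_to_host(f"NCP[{data}]")

def telnet(keystrokes: str) -> str:
    """Network standard terminal, built on the host-to-host layer."""
    return host_to_host(f"TELNET[{keystrokes}]")

def ftp(command: str) -> str:
    """Function-oriented protocol, also built on the host-to-host layer."""
    return host_to_host(f"FTP[{command}]")

print(ftp("RETR report.txt"))   # -> IMP[NCP[FTP[RETR report.txt]]]
print(telnet("ls"))             # -> IMP[NCP[TELNET[ls]]]
```

The nesting in the output is the point: a function-oriented protocol like FTP never touches the IMPs directly; it simply uses the services of the layer below, which in turn uses the layer below it.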
Technological Design Becomes Policy Design
Echoing Lessig's message from Code, two branches of literature emerged arguing that network design principles necessitate network policy principles: end-to-end and the Layered Model. “What is” became “what ought to be.” [See David Hume]
Timothy Wu
Confounded by the new network, some technophobes sought to solve problems transpiring over the network by restraining the network itself. In 1999, Tim Wu, in Application-Centered Internet Analysis, suggested instead that policy scholars focus on the applications with which individuals interface:
This seemingly technical point matters because the Internet by its design allows - even encourages - great diversity above a few basic standards. The “end-to-end” design of the Internet delegates the power to code function to the point nearest to the user: the application. As a result, nearly everything that “counts” about the Internet from a legal standpoint is a function of the particular application at issue and not of the basic Internet protocols. Since applications actually drive Internet usage, they ought also drive legal analysis of the Internet, yielding nuanced rather than stereotyped results.
The design of the network permits disaggregation. When analyzing a problem, a solution is achieved not by regulating the network as a whole (the telecommunications, the Internet, the applications, or the content), but by specifically addressing the problem at hand. If, for example, the application of Internet gambling is the problem, then the solution should address the application of Internet gambling, and not be misdirected at, for example, restraining Internet addresses.
Richard Whitt
Richard Whitt in 2004 released A Horizontal Leap Forward: Formulating a New Communications Public Policy Framework Based on the Network Layers Model. As a matter of policy, the layered model presents essentially the same argument as the end-to-end principle, but from a different vantage point. The end-to-end principle separated the network (TCP/IP) from the activity at the ends (applications and content). A horizontal layered model distinguishes the physical telecommunications network from the Internet (TCP/IP), from applications, and from content. Each layer performs its assigned tasks, but not the tasks of other layers, making the layers severable (almost like interchangeable Lego blocks). Each layer conforms to protocols so that it can interface with the layer above or below. This creates the Internet Hourglass. The bottom telecommunications layer performs its task of transmission and conforms to the protocols of the layer above, the Internet layer. The Internet layer performs its task of creating and routing packets, providing interconnectivity over a packet-switched network. In the layer above, applications can be devised to do any task and will work as long as they conform to the Internet Protocol. In this manner, any application will work over any telecommunications network, with the Internet acting as the glue allowing the separate parts to work together.
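The hourglass described above can be sketched as a toy example (illustrative names only, not a real protocol stack): many applications on top, many transmission networks on the bottom, all interoperating solely through the common IP layer at the narrow waist.

```python
# Toy sketch of the Internet Hourglass: any application runs over any
# network because both sides conform only to the IP layer in the middle.

def ip_packet(payload: str, src: str, dst: str) -> str:
    """The narrow waist: the one protocol everything else conforms to."""
    return f"IP {src}->{dst} | {payload}"

applications = {"Web": "GET /index.html", "VoIP": "audio frame"}
networks = ["dial-up", "DSL", "cable", "wifi"]

# Every application/network pairing works; IP is the glue between them.
for app, message in applications.items():
    for net in networks:
        frame = f"{net}<{ip_packet(f'{app}:{message}', '10.0.0.1', '10.0.0.2')}>"
        print(frame)
```

Note that neither the applications nor the networks know anything about each other; adding a new application, or a new kind of transmission network, requires no change to the other layers, which is the severability the layered model relies on.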
Whitt took this design principle to respond to a problem stemming from the Communications Act itself. In 1934, Congress concluded that it would be best to have an expert federal communications agency, so it created the Federal Communications Commission and enacted the Communications Act of 1934, gathering preexisting authority from the Federal Radio Commission, the Interstate Commerce Commission, and other places, and gluing them together. As new media emerged, such as satellite communications and cable services, Congress glued additional titles onto the Act. The end result was regulatory silos. Each area of communications was its own market, with its own title within the Act and its own bureau within the FCC. Telephone service had the Common Carrier Bureau while television had the Media Bureau. Since these services were distinct and delivered over separate infrastructure, this siloed approach was sufficient.
Whitt argues that the old siloed approach is defunct in an era of convergence, and offers the Layered Model of regulation as an alternative:
To avoid the risk of further serious damage, policy makers must move away from the increasingly outmoded vertical “silos” that artificially separate communications-related services, networks, and industries from each other. Informed by the way that engineers create layered protocol models, and inspired by the analytical work of noted academics and technology experts, policy makers should adopt a comprehensive legal and regulatory framework founded on the Internet’s horizontal network layers. We must build our laws around the Internet, rather than the other way around. By tracking the architectural model of the Internet—with IP at the center—we can develop a powerful analytical tool providing granular market analysis within each layer, which in turn puts public policy on a more sure empirical footing.
[Whitt 2004 p. 591]
This breaks the network down into sub-problems. The separability of these different layers enabled different markets. Different layers are serviced by different equipment vendors and by different service providers. They have different market dynamics, and different legal and policy concerns. Instead of regulatory silos, where one FCC bureau grapples with the total regulatory question of, for example, telephone service, the layered model clarifies that the physical network layer is an FCC problem, the Internet layer may be an NTIA problem, applications and services may be an FTC or DOJ problem, and content may be a copyright problem. As Whitt notes, the FCC broke from the siloed model and implicitly adopted a layered model in the FCC’s Computer Inquiries. (Steve J. Lukasik, Director of ARPA until 1975, who joined the FCC as Chief Technologist in 1979, may have had significant influence over the Computer Inquiries.)
Roberts, Computer Science Museum 1988 ("The thing split and took two paths in the midst of the '71 to '73 time frame, or whatever, when I left, I then took it off into X-25 and that virtual circuit direction, and we still had the seven-layer model. We introduced the first three in X-25, and the next levels were in the host interface. Bob Kahn and Vint Cerf went ahead with TCP and pushed the current original datagram concepts even further. Everybody agreed -- we had more headaches with anything that didn't obey layers in the whole system, from time to begin with -- every time anybody violated a layer we got in trouble. So, everybody knew that you needed as cleanly defined layers as possible, and that went on even very early, even in the first implementations, in terms of the file protocols and on top of all of the communications protocols. So it was very clear from early on, and I think -- well, you may have researched who was best at stating it; everybody experienced it.")
[Figure: the Internet Hourglass — example services such as the Web over dial-up and telephony over DSL, mapped across layers and providers]
OSI
"FIPS 146-1 adopted the Government Open Systems Interconnection Profile (GOSIP) which defines a common set of Open Systems Interconnection (OSI) protocols that enable systems developed by different vendors to interoperate and the users of different applications of those systems to exchange information. This change modifies FIPS 146-1 by removing the requirement that Federal agencies specify GOSIP protocols when they acquire networking products and services and communications systems and services. This change references additional specifications that Federal agencies may use in acquiring data communications protocols. " FIPS 146-2, Profiles for Open Systems Internetworking Technologies (POSIT), NIST (May 15, 1995)
"In October 1993, NIST established the Federal Internetworking Requirements Panel to study and recommend policies on the use of networking standards by the Federal government. Based on feedback from industry, individual users, and international organizations on its draft report, the Panel submitted its final recommendations for public comment on May 1994. The Panel concluded that no single networking protocol suite meets the full range of government requirements for data internetworking. The Panel recommended that Federal government agencies select standards based on their interoperability needs, existing infrastructure, costs, marketplace products, and the degree to which the protocol has been adopted as a standard. As follow-up, NIST has proposed changes to the Federal Information Processing Standard that will remove the requirement specifying use of the Government Open Systems Interconnection Profile (GOSIP) protocols when agencies acquire networking and communication products. NIST currently is soliciting public comment on these proposed changes and will issue a final version in early 1995." - Department of Commerce, National Information Infrastructure Progress Report p 11 September 1993-1994.
- FIPS 146-2, Profiles for Open Systems Internetworking Technologies (POSIT), NIST (May 15, 1995)
- Department of Commerce, National Information Infrastructure Progress Report p 11 September 1993-1994.
- OSI Protocols, CISCO
- OSI Reference Model
Papers
- Mark Handley, On Inter Layer Assumptions, IAB
- Yoo, Christopher S., Protocol Layering and Internet Policy, 161 U. Pa. L. Rev. 1707 (2013); U of Penn, Inst for Law & Econ Research Paper No. 13-18.
- Andrew L Russell, “OSI: The Internet That Wasn’t,” IEEE Spectrum 50 (August 2013).
- Whitt, Richard S., A Deference to Protocol: Fashioning a Three-Dimensional Public Policy Framework for the Internet Era (September 14, 2012). 2012 TRPC.
- Information Economy Report 2006: Chapter 7 The Layered Internet Architecture: Governance Principles and Policies p. 275, United Nations Conference on Trade and Development (Nov. 2006)
- Andrew L Russell, “‘Rough Consensus and Running Code’ and the Internet-OSI Standards War,” IEEE Annals of the History of Computing 28:3 (2006): 48-61.
- Sicker & Blumensaadt, Misunderstanding the Layered Models, J. Telecomm. & High Tech. L., 44, (Fall 2005)
- "Beyond the Layered Model" by J. Scott Marcus , Silicon Flat Irons PPT
- Marcus, J. Scott, Beyond Layers (May 9, 2006) (“The Layered approach offers surprisingly little guidance to the regulator”).
- Layer architectures and regulation in telecommunications, A. M. Odlyzko. Pages 16-19 in New Millennium Research Council report, Free Ride: Deficiencies of the MCI 'Layers' Policy Model and the Need for Principles that Encourage Competition in the New IP World, July 2004. [preprint, text] [full NMRC report, PDF]
- Rick Whitt, MCI, A Horizontal Leap Forward: Formulating a New Communications Public Policy Framework Based on the Network Layers Model, 56 Fed. Comm. L.J. 587 (2004)
- Craig McTaggart, A Layered Approach to Internet Legal Analysis, 48 McGill L.J. 571 (2003)
- Lawrence B. Solum & Minn Chung, The Layers Principle: Internet Architecture and the Law, (University of San Diego School of Law, Public Law and Legal Theory Research Paper No. 55) (2003)
- Rob Frieden, Adjusting the Horizontal and Vertical in Telecommunications Regulation: A Comparison of the Traditional and a New Layered Approach, PSU 2/4/03
- The Potential Relevance to the United States of the European Union’s Newly Adopted Regulatory Framework for Telecommunications by J. Scott Marcus; July 2002. [ Text | Word | Acrobat | News Release ]
- Douglas C. Sicker, Further Defining a Layered Model for Telecommunications Policy, TPRC 2002 Download: [Abstract] [PDF]
- Robert Cannon, Will the Real Internet Please Stand Up: A Quest to Define the Internet, TPRC 2002 Download: [HTML]
- Sicker & Mindel, "Refinements of a Layered Model For Telecommunications Policy," The Journal on Telecommunications and High Technology Law, Volume I, 2002
- Robert Cannon, The Legacy of the Computer Inquiries
- Kevin Werbach, A Layered Model for Internet Policy, 1 J. Telecomm. & High Tech. L. 37 (2002) Draft Sept 2000. Edventure| TPRC pdf
- The Computer Inquiry rules are set forth in the following White Paper: Where ISPs and Telephone Companies Compete: A Guide to the Computer Inquiries, Enhanced Service Providers and Information Service Providers (March 2001) | Word | Published in Commlaw Conspectus and TPRC Proceedings 2000.
- Robert M Entman, Rapporteur, Transition to an IP Environment, The Aspen Institute (2001)
- Michael L. Katz, Thoughts on the Implication of Technological Change for Telecommunications Policy, The Aspen Institute (2001)
- Computer Science and Telecommunications Board, National Research Council, Broadband: Bringing Home the Bits, p. 182 (2002) (calling for logical layer unbundling).
- "A Paradigm Shift for the Stupid Network: Interconnecting with Legacy Networks in the Internet Era" by Timothy Denton and François Ménard, June 15, 2000 (293 KB)
- Timothy Wu, Application-Centered Internet Analysis, 85 VA. L. REV. 1163 (1999).
- J. Weinberg, "The Internet and Telecommunications Services, Universal Service Mechanisms, Access Charges and Other Flotsam of the Regulatory System," TPRC, 1998.
- Kevin Werbach, Digital Tornado: The Internet and Telecommunications Policy, FCC Office of Plans and Policy Working Paper No. 29, p. 1 (March 1997) ("The Internet functions as a series of layers, as increasingly complex and specific components are superimposed on but independent from other components")
- CSTB, Realizing the Info Future p. 5 1994 ("The Open Data Network proposed in this report involves a four-level layered architecture configured as follows: (1) at the lowest level is an abstract bit-level service, the bearer service, which is realized out of the lines, switches, and other elements of networking technology; (2) above this level is the transport level, with functionality that transforms the basic bearer service into the proper infrastructure for higher-level applications (as is done in today's Internet by the TCP protocol) and with coding formats to support various kinds of traffic (e.g., voice, video, fax); (3) above the transport level is the middleware , with commonly used functions (e.g., file system support, privacy assurance, billing and collection, and network directory services); and (4) at the upper level are the applications with which users interact directly. This layered approach with well-defined boundaries permits fair and open competition among providers of all sorts at each of the layers.")
- F. M. Bar, "Configuring the Telecommunications Infrastructure for the Computer Age: The Economics of Network Control," Ph.D. thesis, University of California, Berkeley (1990)
Books
- Janet Abbate, Inventing the Internet (MIT Press 1999) Recounting how the layered model was adopted for ARPANET
- P 66: "The initial division between subnet and host layers had simplified the work of the network's designers; now the [Network Control Center] NCC allowed the network's users to ignore much of the operational complexity of the subnet and to view the entire communications layer as a black box operated by Bolt, Beranek and Newman [BBN]. The NCC had become a managerial reinforcement of ARPA's layering scheme."
- P 67: "Roberts suggested separating the host functions into two layers. The first, called the "host layer," would feature a general-purpose protocol to set up communications between a pair of hosts; the second, called the "application layer," would specify protocols for network applications such as remote login or file transfer. Having separate host and application layers would simplify the host protocol and lessen the burden on the host system's programmers. Also, eliminating the need for each application to duplicate the work of setting up a host-to-host connection would make it easier to create applications programs, thereby encouraging people to add to the pool of network resources. The ARPANET model now had three layers...." This model would be reflected in the Network Control Protocol (NCP)
- Katie Hafner and Matthew Lyon, Where Wizards Stay Up Late: The Origins of the Internet, p. 147 (1996):
- "Whatever structure they chose, they knew they wanted it to be as open, adaptable, and accessible to inventiveness as possible. The general view was that any protocol was a potential building block, and so the best approach was to define simple protocols, each limited in scope, with the expectation that any of them might someday be joined or modified in various unanticipated ways. The protocol design philosophy adopted by the NWG broke ground for what came to be widely accepted as the “layered” approach to protocols."