
SSIG - The role of standards, protocols and codes in Internet governance

Here are some notes from the presentation by Avri Doria, from Luleå University of Technology:

Avri Doria: How can we look at protocols, standards and code from a principles-based perspective?

Do people writing code think they are doing Internet governance? Most of the time, no.

In documents we see "in their respective roles": how is that relevant to protocols, standards and code?

Code, standards and protocols are a major means by which these norms, rules, decision-making procedures and programmes are instantiated in the network. Most programmers do not think about policy; they are just solving problems. On the other side, politicians do not care much about what programmers are doing. This may be changing a little, but much less than we think.


There is a divided view on IG. On one hand, the Internet can be understood by reference to other institutions in society, such as telecom, media and trade, subject to the same rules and warranting the same form of analysis. On the other hand, there are those who argue that it is different and requires new approaches.


We can use our existing knowledge to start understanding it. It is a system composed of a boundless complexity of code. This is what makes it different.

While the Internet is unique, it does not exist in a vacuum. Knowledge from other fields can be used to begin understanding it, but it is just an analogy. Extreme care must be taken in trying to apply existing governance regimes: applying telecom rules and regimes to the Internet can result in more harm than good.

What came first: the standard, the protocol, the code, the principles, or something else? Is there a dialectical process?

When we talk about internet principles, we are thinking of design principles and organization principles.



What are design principles? They are engineering constructs for when we have many possibilities and a choice has to be made. They enable a community of designers to work together.

The first of these was the packet-based network. It allows for a confederated network of networks where each network handles the datagram using the best paths that exist at that point in time, according to its own policies. This creates a network with emergent properties. What you do in this network is pretty much decided at the ends.
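A rough sketch of what that per-hop handling looks like (my own illustration, not from the presentation; the addresses, gateway names and table are made up): each network consults only its own routing table and policies to pick the next hop, with the most specific matching prefix winning.

```python
# Minimal sketch of per-hop datagram forwarding (illustrative, not a real router).
import ipaddress

# Hypothetical routing table for one network: prefix -> next hop chosen by local policy.
ROUTES = {
    ipaddress.ip_network("10.0.0.0/8"): "gateway-a",
    ipaddress.ip_network("10.1.0.0/16"): "gateway-b",     # a more specific path
    ipaddress.ip_network("0.0.0.0/0"): "default-gateway",  # everything else
}

def next_hop(destination: str) -> str:
    """Longest-prefix match: the most specific route this network knows about wins."""
    addr = ipaddress.ip_address(destination)
    matches = [net for net in ROUTES if addr in net]
    best = max(matches, key=lambda net: net.prefixlen)
    return ROUTES[best]

print(next_hop("10.1.2.3"))   # gateway-b (most specific route)
print(next_hop("192.0.2.7"))  # default-gateway
```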

The end-to-end principle is one that many have heard of, and it is largely misunderstood. It has been used as a political tool, abused and misunderstood. Its definition: the function in question can completely and correctly be implemented only with the knowledge and help of the applications standing at the endpoints of the communication system.

When one argues about where packet ordering belongs, or at which level an address needs to be known (the application level or another), the end-to-end principle helps identify where the ends are: the points where you have just enough information to actually fulfil a function.

The first definition was in 1980 (Saltzer et al). Some people today say that intelligence cannot be added inside the network - this is not mentioned in the principle! There is intelligence in the network for a variety of functions.
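A small illustration of the classic end-to-end example of file-transfer integrity (mine, with made-up function names, not from the talk): only the endpoint applications see the data before it enters and after it leaves the network, so only they can fully verify it, whatever checks the network does in between.

```python
# Sketch of end-to-end integrity checking between two endpoint applications.
import hashlib

def send_file(data: bytes):
    # The sending application computes a checksum over the original data.
    return data, hashlib.sha256(data).hexdigest()

def receive_file(data: bytes, expected: str) -> bool:
    # The receiving application recomputes it; corruption introduced anywhere
    # along the path (links, routers, disks) is caught only here, at the end.
    return hashlib.sha256(data).hexdigest() == expected

payload, checksum = send_file(b"important document")
assert receive_file(payload, checksum)                     # intact transfer
assert not receive_file(b"imp0rtant document", checksum)   # corrupted in transit
```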

Postel's robustness principle: be conservative in what you send and liberal in what you accept (RFC 793). When one reads a protocol and tries to code it, there is a big gap in understanding what the authors meant. When you receive something that is a little different from what you expected, process it if it is good enough. This is critical for the functioning of a complex system: if implementations only accepted packets that were rigorously perfect, there would be a lot of crashes.
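Here is a small sketch of what that can look like in practice (my own, using a hypothetical text header format rather than any real protocol): the sender emits one canonical form, while the receiver tolerates minor deviations instead of rejecting the message.

```python
# Sketch of "conservative in what you send, liberal in what you accept".

def emit_header(name: str, value: str) -> str:
    # Conservative in what you send: one canonical form only.
    return f"{name.strip().title()}: {value.strip()}\r\n"

def parse_header(line: str):
    # Liberal in what you accept: tolerate stray whitespace, a missing CR and
    # unusual capitalisation, rather than rejecting the whole message.
    line = line.rstrip("\r\n")
    if ":" not in line:
        return None  # truly malformed; nothing sensible can be recovered
    name, _, value = line.partition(":")
    return name.strip().lower(), value.strip()

print(emit_header("content-type", "text/plain"))
print(parse_header("  CONTENT-TYPE :   text/plain   \n"))  # still parses fine
```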

Layered architecture. You work on a specific problem at a specific layer. You start with data (when writing an email message). The application layer is where the program puts it into a specific format. The transport layer looks at it, may figure out it is too big, and cuts it up, placing envelopes and metadata on it. Then there is the IP layer, dealing with IP addresses and moving packets from one machine to the next using parameters that tell how they will travel.

The layered architecture allows people to work at a specific layer without having to worry about what happens in the other layers, bringing together different specialists collaborating on the development of the Internet at the same time.
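To make the email example concrete, here is a toy sketch (mine, not real protocol code; the header formats and segment size are invented) where each layer only wraps what the layer above hands it, without inspecting the other layers' work:

```python
# Toy sketch of layered encapsulation for the email example above.
MAX_SEGMENT = 20  # artificially small so the transport layer has to split the data

def application_layer(message: str) -> bytes:
    # Application layer: put the message into a wire format.
    return message.encode("utf-8")

def transport_layer(data: bytes) -> list:
    # Transport layer: cut the data into segments, prepend a tiny header
    # carrying the byte offset so the other end can reassemble in order.
    return [
        f"SEQ={i}|".encode() + data[i:i + MAX_SEGMENT]
        for i in range(0, len(data), MAX_SEGMENT)
    ]

def ip_layer(segments, src: str, dst: str) -> list:
    # IP layer: add addressing so each packet can travel hop by hop on its own.
    return [f"FROM={src};TO={dst}|".encode() + s for s in segments]

packets = ip_layer(
    transport_layer(application_layer("Hello, this is an email body.")),
    "192.0.2.1", "198.51.100.7",
)
for p in packets:
    print(p)
```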

Hourglass model. IP was specifically designed to be like the waist of an hourglass: the application does not need to understand anything about the network it is on. This was a key factor in innovation. You can keep innovating in the network, whether wi-fi, WiMAX or other, and the applications still work.
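As a small illustration (not from the talk), an application talks only to the narrow IP/TCP waist through the standard socket API and never needs to know whether its packets travel over wi-fi, WiMAX, Ethernet or anything else:

```python
# The same application code works unchanged over any access network underneath IP.
import socket

def fetch_home_page(host: str) -> bytes:
    # Connect to the narrow waist (TCP over IP); the link layer below is invisible here.
    with socket.create_connection((host, 80), timeout=5) as conn:
        conn.sendall(b"GET / HTTP/1.0\r\nHost: " + host.encode() + b"\r\n\r\n")
        chunks = []
        while chunk := conn.recv(4096):
            chunks.append(chunk)
        return b"".join(chunks)

# print(fetch_home_page("example.com")[:200])  # same code on any link technology
```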


When we add multicast and QoS, some extra waist is added. NAT, firewalls and VoIP servers are middleboxes at the hourglass waist, so filtering and quality systems are added to the model. There is also replacement and inversion: ATM, MPLS... making the waist of the hourglass more complex.


Ultimately, the hourglass waist ends up far more complex than the original clean model.




Shared fate. This means that control information travels the network along the same path as the data. It is fundamental to the management of the network. MPLS has broken that model to a certain extent, and those who have followed the IETF discussions have seen a tug of war and disputes.

Creative anarchy. There was no top-down design; anyone, anywhere could contribute the next innovation. If someone had had to get permission to create VoIP or social networking, we would not have them. It is a fundamental property, and some people point to it as the source of problems such as spam and viruses.

Where do protocols and standards come from? Some are produced independently and become de facto standards. Some are produced by Standards Development Organizations through a variety of paths.

There are different kinds of SDOs: the ITU (an intergovernmental institution); industry bodies creating standards that members use in requests for proposals; ad hoc groups whose output becomes a standard when adopted by the market (the IETF case); or a private entity that uses contractual conditions to impose its standards as policy.

Code in the technical sense: BGP is used for routing in the Internet, and the running code did not really match the protocol standard.

Standards level the playing field. Could we have an Internet without standards? And who makes the standards? You do not get new players into a market without standards they can use; at the same time, standards can sometimes limit innovation.

What about legitimacy? If code becomes policy, is it fair that those who can write code define the standards? Governments are still trying to find their role in the Internet - are they protectors or usurpers? Maybe somewhere in the middle. The coders have a feeling that they built the net and the government should not intervene.

What happens if this experiment in multistakeholder governance does not work?

There are various forms of multistakeholder governance. The IETF, with its individual-participant model, is in some ways the original multistakeholder organization in Internet governance. In other places, like the ITU, governments dominate. ICANN is dominated by industry, and ISOC and the RIRs are dominated by the Internet technical community.

Generalizations. The real issues are often at the confluence of policy and technology: each can affect the other. Internet governance is a tussle of conflicting principles and priorities. Code governs what is possible; policy covers what is allowed. Sometimes code comes first and drives policy. Sometimes policy comes first and directs code.


Creative Commons License
Blog posts written by Seiiti Arata on #SSIG are licensed under a Creative Commons Attribution-Noncommercial 3.0 Brazil License.



Comment by Seiiti on March 22, 2010 at 7:41pm
I asked a question in the Q&A, which goes like this:

I'm remembering the disputes that we had in the past, for example X.400 vs. SMTP, or X.25 vs. TCP/IP.

In these cases, one standard would end up being adopted. Which one?

At the time, when code and functionality were the most important elements, the best solution won, and the coders won.

But at that time there was no multistakeholder model; under the free-market model, the most powerful decide, and those were the coders.

What about now? Wolfgang mentioned the Vint Cerf quote: nobody cared... but today a lot of people care, because there is money to be made.

Let's look at Apple and the iPhone. Apple decides which apps are available, and can even decide that Apple stores should not sell plastic protective covers for the iPhone. They can influence even the physical layer. But this represents the danger, which Jonathan Zittrain mentioned, of losing the generativity of the Internet.

So the question is: do you think there are now greater risks of the definition of standards being captured, whether by Apple, the ITU or single governments? If yes, how would the multistakeholder model be useful specifically for this challenge?

Avri believes that to avoid capture in the adoption of standards, the coders should keep coding better, so that they stay ahead.


