- Open Technologies for an Open World
Open Standards, Open Source, Open Mind
Networks, Network enterprises, Peer-to-peer, Privacy
5. The Network Society
"It is not proper to think of networks as connecting computers. Rather, they connect people using computers to mediate. The great success of the Internet is not technical, but in human impact. Electronic mail may not be a wonderful advance in Computer Science, but it is a whole new way for people to communicate. The continued growth of the Internet is a technical challenge to all of us, but we must never loose sight of where we came from, the great change we have worked on the larger computer community, and the great potential we have for future change." - David Clark
Licklider, like David Clark, was among the first to notice the social importance of networks and to envision the future paradigm of a global network. The technology evolved, as we analysed in the previous chapter, providing the infrastructure that made this network possible. However, the computers, network devices and protocols were simply the media through which the communication was performed. Similar to the psychophysical definition of sound, as a wave transmission that is perceived by the ear, the information available in the network only becomes interesting when it is consulted by a user. The users of a network are the elements that count, as stated by Metcalfe: "The community value of a network grows as the square of the number of its users increase". The mission of people like Metcalfe was to be like the first architects who built cities and planned squares, working to give people the material conditions to meet and communicate.
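Metcalfe's observation can be made concrete with a few lines of arithmetic. The sketch below is illustrative only: the proportionality constant k and the exact pair-counting form n*(n-1)/2 are our assumptions, not Metcalfe's precise formulation, but they capture the quadratic growth he described.

```python
def metcalfe_value(n, k=1.0):
    """Heuristic value of a network with n users: proportional to the
    number of possible pairwise connections, n*(n-1)/2, which grows
    roughly as the square of n."""
    return k * n * (n - 1) / 2

# Doubling the user base roughly quadruples the network's value:
small = metcalfe_value(100)   # 4950 possible connections
large = metcalfe_value(200)   # 19900 possible connections
```

The key point for the argument above is the ratio, not the absolute numbers: each new user adds value to every existing user, which is why growth compounds.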
We saw that the first networking concepts were built around the definition of common protocols, open standards and open systems. After the networks were established, they were used to improve communication among researchers, and proved to be the best way to exchange information, discuss ideas, reach agreements, publish them, and restart the cycle. This created a positive spiral, which elevated the performance of the network:
§ With an increasing consistency, due to the intense sharing of knowledge among users, aimed at increasing the efficiency and usage of the network.
§ With a high connectedness, due to the increasing efficiency and usage of the network, and to its reduced cost and complexity.
We may also note two external factors that contributed to the factors above:
§ The incentive of the governments - and most specifically the military organizations during the cold war - that funded the infrastructure and sponsored the academic institutions. This allowed the research to produce the knowledge and be connected to the network.
§ The increasing performance of microchips at a given price, as described by Moore's law, which implied falling costs of computing power and hardware components, allowing more users to be connected to the network.
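The factors above can be illustrated numerically. Moore's law is commonly stated as a doubling of transistor density (and hence of performance at a given price) roughly every 18 to 24 months; the 2-year doubling period below is an assumption chosen for illustration, not a figure from the text.

```python
def relative_performance(years, doubling_period=2.0):
    """Performance available at a fixed price after a number of years,
    relative to today, under an assumed exponential doubling schedule
    (Moore's law)."""
    return 2 ** (years / doubling_period)

# After a decade, the same money buys about 32x the computing power,
# which is what steadily lowered the entry price to the network:
gain = relative_performance(10)  # 32.0
```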
5.2. The Network Enterprise
5.2.1. From merchant networks
Initially, the networks were used by high-tech companies - like AT&T's Bell Labs, Rand Corporation, BBN (Bolt, Beranek & Newman) and IBM - to exchange information with the scientific community, and to progressively build stronger infrastructure devices, protocols and software. IBM still uses its revolutionary proprietary protocols and standards to promote cohesive work among its different R&D centres all over the world, smoothly integrating them with the other departments and - through the usage of standard interfaces, TCP/IP and OSI-compliant protocols - with the academic institutions. IBM understood it should draw on all forms of knowledge to remain an innovative firm. Innovation was critical, and companies like Sun and 3Com flourished in this environment. The computer scientists participating in this networked environment alternated between, or combined, jobs in industry and in academic institutions. This created a networked milieu of innovation whose dynamics and goals became largely autonomous from the specific initial purposes.
5.2.2. To the merchantable network
Later, all enterprises started to be connected, via the network, with suppliers, customers, service providers and research laboratories all over the world, in a multicultural framework, forming network enterprises and founding the global economy. Thus the information economy emerged on a planetary level, in different cultural/national contexts, evolving around a common matrix of organisational forms in the processes of production, consumption, and distribution.
Castells identified an important shift from vertical bureaucracies (the hierarchical oligopolies of the industrial era) to the horizontal corporation (the networked companies that survived and thrived in the informational economy), a "dynamic and strategically planned network of self-programmed, self-directed units based on decentralization, participation, and coordination. (...) The manner in which a company shares information and systems is a critical element in the strength of its relationships".
The networks formed by the horizontal corporations are divided, according to Ernst, into intra-firm (linking different divisions and business functions of the company) and inter-firm (normally linking suppliers, producers or customers). Within the scope of our study, however, two other important inter-firm connections, also identified by Ernst, deserve attention:
§ Standards coalitions - initiated by potential global standard setters with the explicit purpose of locking in as many firms as possible into their proprietary product, architectural, or interface standards. This is the case of Wintel (the Microsoft and Intel association explained in chapter 2.2.4) and .Net (discussed in chapter 3.2.2).
§ Technology cooperation networks - built to facilitate the exchange and joint development of product design and production technology, involving cross-licensing and patent swapping, and permitting the sharing of R&D. Under such arrangements, knowledge typically flows in both directions, and all participants need to master a broad array of technological capabilities. The mainframe architecture (explained in chapters 2.1.1 and 2.2.1), MDA (discussed in chapter 3.1.1) and Java (discussed in chapter 3.2.1) illustrate this.
5.3. Peer-to-peer and collective conscience
In parallel with the formation of the network enterprises, academics started to use the network to build strong communities of interest, which began by exchanging ideas and finally discovered a potential to defy and compete with big monopolies. New companies were created to exploit the new technological developments in ways unanticipated by the scientists and big companies - as was the case of Apple, Microsoft and Intel - reducing the entry price to be part of the network. The benefits of being in the network grew exponentially, because of the greater number of connections. This gave birth to modern counterculture movements like the hackers and the crackers; it helped to break geographic barriers in the consolidation of currents of thought, like anti-globalisation; and it helped the formation of worldwide/underworld organizations like (cyber)terrorism.
5.3.1. Online communities
One of the starting points of the online subculture was the set of science networks (like ARPANET, CSNET and BITNET) being used to exchange personal messages around subjects like science fiction, but it was the advent of personal computing and cheap networking equipment that gave birth to the BBS (Bulletin Board Systems). Rheingold described them as "a grassroots element to the Net that was not, until very recently, involved with all the high-tech, top-secret doings that led to ARPANET (...). Real grassroots, the kind that grow in the ground, are a self-similar branching structure, a network of networks. Each grass seed grows a branching set of roots, and then many more smaller roots grow off those; the roots of each grass plant interconnect physically with the roots of adjacent plants, as any gardener who has tried to uproot a lawn has learned." Rheingold - who was an active member of an online community called the WELL and today specializes in the creation of new communities - uses the term virtual communities to define the social phenomenon spawned by the BBS: social aggregations that emerge from the Net when enough people carry on public discussions long enough, with sufficient human feeling, to form webs of personal relationships in cyberspace.
He identified that the technology that makes virtual communities possible has the potential to bring enormous power - intellectual, social, commercial and political - to ordinary citizens, but this latent technical power must be used intelligently and deliberately by an informed population. He considered the online subculture to be "like an ecosystem of subcultures, some frivolous, others serious." Some of the serious communities united programmers, who used the network to exchange information and programs - mostly related to the network itself - aiming at recognition from their peers.
The hackers were already used to the time-sharing system networks as a communication medium. With the networks, it became easier to exchange programs and routines, and the university walls were no longer the barriers they had been. The community would then span to other academic centres, and to other countries. Due to fate, or to the anarchic tendencies of academic ecosystems, the networks started to be organized in a decentralized, peer-to-peer fashion.
5.3.2. Peer networks and cooperative computing
Initially, the term peer-to-peer (P2P, or simply peer) described a protocol, application, or network where every node had equivalent capabilities and privileges, being able to initiate or complete any supported transaction. Beyond the technical definition, the term came to designate decentralized virtual communities where every individual participated at the same level, obtaining information with the same access rights, and sharing this and new material with the other network members. Bauwens abstracts the peer-to-peer concept to other levels, like politics and spirituality, and even suggests the hypothesis of a new civilization format based on P2P.
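The technical definition above - every node with equivalent capabilities, no central server - can be sketched in a few lines. This is a minimal in-memory illustration under our own assumptions (the class and method names are ours, and real P2P systems add routing, discovery and multi-hop search), not a description of any particular protocol.

```python
class Peer:
    """Minimal sketch of the peer-to-peer idea: every node has the same
    capabilities and privileges -- it can both offer resources and
    request them, with no central server mediating."""

    def __init__(self, name):
        self.name = name
        self.shared = {}       # resources this peer offers to others
        self.neighbours = []   # directly known peers

    def connect(self, other):
        # Symmetric link: either side can initiate a transaction.
        self.neighbours.append(other)
        other.neighbours.append(self)

    def publish(self, key, value):
        self.shared[key] = value

    def fetch(self, key):
        # Look locally first, then ask direct neighbours (one hop only,
        # for brevity; real systems search the wider overlay network).
        if key in self.shared:
            return self.shared[key]
        for peer in self.neighbours:
            if key in peer.shared:
                return peer.shared[key]
        return None
```

Note the symmetry: unlike a client/server design, `fetch` and `publish` exist on every node, which is precisely what distinguishes a peer network.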
Bar and Borrus suggested that, based on the two elements already discussed - ubiquitous computing and a coherent infrastructure - a new computing paradigm emerged in the 1990s, shifting from the simple linkage of computers to "cooperative" computing. Mimicking the cooperation between different companies, and based on the anarchic, peer-to-peer structure of organization, the hackers started to organize themselves, initially simply letting spontaneous and informal communication flourish, generating reciprocity and support through the dynamics of sustained interaction.
5.3.3. May the force be with the hackers
The world-wide web was built on the contribution of the hacker culture of the 1970s. The group of researchers at CERN led by Tim Berners-Lee and Robert Cailliau relied on the hypertext concept created by Ted Nelson in the 1960s, used hacker technology like UNIX and TCP/IP, and distributed their software free over the Internet. Richard Stallman and Linus Torvalds gave the initial impulse to the new-hackerism movement by creating GNU and Linux. As important as the technological features of GNU and Linux are their sociological ones: they are at the core of modern hackerdom activities. Stallman and Torvalds started their projects alone and, once the concepts were stable enough to be understood by other people, used the Internet as the communication medium to form P2P task forces, as opposed to having a project team - sponsored by governments or companies - working in the same physical space. They had a centralizing role, needed to guarantee the integration of the different software parts, but the power still resided in the community.
According to Eric Raymond, "Linux is subversive. (...) I believed that the most important software (...) needed to be built like cathedrals, carefully crafted by individual wizards or small bands of mages working in splendid isolation. (...) Linus Torvalds' style of development - release early and often, delegate everything you can, be open to the point of promiscuity - came as a surprise. No quiet, reverent cathedral-building here - rather, the Linux community seemed to resemble a great babbling bazaar of differing agendas and approaches (aptly symbolized by the Linux archive sites, who'd take submissions from anyone) out of which a coherent and stable system could seemingly emerge only by a succession of miracles."
5.3.4. The motivated and ethical hacker
The most difficult thing, for people outside the hacker communities, is to understand their motivations. According to Himanen, the hacker ethic may be divided into the work ethic, the money ethic and the network ethic. By the work ethic, he explains that the hacker activity must be joyful, enthusiastic and passionate, while performed in an individualistic rhythm of life.
The money ethic states that the main hacker objective shall be recognition from peers and that, through "capitalism hackerism", one can take part in traditional capitalism only temporarily (until one has enough capital to dedicate oneself exclusively to "having pleasure") or on a part-time basis (working for a traditional company during the day, developing free software during the night). In addition, most of the hackers who established companies to earn money from free software and open source see no problem in selling software or services, as long as the work ethic is followed.
The network ethic (or nethic) preaches that a hacker should always practice freedom of expression, respect privacy and stimulate self-activity - "the realization of a person's passion instead of encouraging a person to be just a passive receiver in life (...) very different to the traditional media".
"A hacker who lives according to the hacker ethic on all three of these levels gains the community's highest respect. This hacker becomes a true hero when she or he manages to honour the final value ( ): creativity - that is, the imaginative use of one's own abilities, the surprising continuous surpassing of oneself, and the giving to the world of a genuinely valuable new contribution" .
5.3.5. The force reaches Microsoft
David Stutz, the man formerly responsible for Microsoft's anti-open-source strategy, attacked Microsoft's PC-based strategy in February 2002, just after his retirement, arguing that it was misguided in a computing world where complex networks are more important than single devices. He maintained that the internet, the web and open source software projects - in which communities of programmers contribute improvements that are distributed free - are all part of the steady advance of networked computing.
Stutz suggested that Microsoft needs to focus on building a layer of software that integrates network technology, but that this layer should not be an operating system like Windows, which is tied to PC technology. "To continue to lead the pack, Microsoft must innovate quickly," he said. "If the PC is all that the future holds, then growth prospects are bleak."
5.3.6. Lingua Franca
The hacker communication tools were e-mail, newsgroups, chats and, later, weblogs. The contents revolved around new programs, tools, routines, problems, and updates. The language was often English. This changed after the spread of cheap web access to other countries, when voluntary translations of programs, web sites and documentation flourished - again in exchange for recognition. This phenomenon helped to expand the borders of hackerdom, by creating hacking sub-networks speaking regional languages, even if the common language among the different local communities remained English. To some analysts, the exchange of information via a written medium could give an impulse to the recuperation of constructed, rational discourse. What finally happened, on the contrary, was the stimulation of a new form of language, expressed in electronic texts and richly completed by funny symbols, weird acronyms and multimedia.
Another important factor is the asynchronous communication favoured by e-mails and newsgroups. This allowed each developer to work when convenient, with no time obligations, exactly as preached by the hacker ethic.
5.4. Privacy
The need for privacy on the net started to attract public attention in 1999, when Intel threatened to create a processor identified by a unique number, known as the PSN (Processor Serial Number). Although this practice is common in mainframes - software licenses are often validated by comparing an encrypted key with the computer's serial number - its implementation in personal computers could be the foundation of a vast tracking system, helping to accumulate data on users as they travel around the Web and violating their fundamental right to privacy. Outraged privacy advocates launched a boycott of products containing the Intel Pentium III chip, the first such broad-based boycott of a product over a privacy issue. And they won the battle: after a letter from the American government, Intel finally decided to disable the PSN feature.
Another long-term battle is against the abusive usage of cookies to trace the virtual footsteps of online users. The main advantage of cookies is the addition of a simple, persistent, client-side state, which significantly extends the capabilities of Web-based client/server applications. Without this kind of persistent state, it is virtually impossible to reliably carry user information between two web pages, a functionality required by commercial web sites. Nevertheless, the collection of information is sometimes excessive, with consumer habits being monitored by marketing companies and stored in databases that are later used in aggressive advertising actions.
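The mechanism is simple enough to sketch with Python's standard library. The server attaches a small named value to its response; the browser stores it and sends it back on every later request, which is what makes both legitimate sessions and cross-site tracking possible. The cookie name and value below are hypothetical examples, not taken from any real site.

```python
from http.cookies import SimpleCookie

# Server side: attach a persistent identifier to the HTTP response.
cookie = SimpleCookie()
cookie["session_id"] = "abc123"          # hypothetical identifier
cookie["session_id"]["path"] = "/"       # valid for the whole site
cookie["session_id"]["max-age"] = 3600   # persists for one hour
header = cookie.output()                 # "Set-Cookie: session_id=..."

# Client side (the browser): the stored value comes back on later
# requests, letting the site -- or an ad network whose cookie it is --
# recognize the same user across pages.
returned = SimpleCookie()
returned.load("session_id=abc123")
```

The privacy problem discussed above arises when a third party, such as an advertising network embedded in many sites, reads the same identifier back across all of them and correlates the visits.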
The original cookie specification, by Netscape, had several flaws, among them the lack of a mechanism for the user to accept or refuse cookies. The IETF prepared a new proposal, containing a privacy section that enforces the need for this acceptance. The newest releases of Microsoft's Internet Explorer implement part of those suggestions.
A famous case is EPIC against DoubleClick. "EPIC (Electronic Privacy Information Center) filed a complaint with the Federal Trade Commission on February 10, 2000, concerning the information collection practices of DoubleClick Inc., a leading Internet advertising firm, and its business partners. The complaint alleges that DoubleClick is unlawfully tracking the online activities of Internet users (through the placement of cookies) and combining surfing records with detailed personal profiles contained in a national marketing database. EPIC's complaint follows the merger of DoubleClick and Abacus Direct, the country's largest catalog database firm. DoubleClick has announced its intention to combine anonymous Internet profiles in the DoubleClick database with the personal information contained in the Abacus database."
It is important to notice that all the discussions about cookies, and the discovery of the privacy issues, are due to the openness of the HTTP protocol, needed for interoperability between the HTML language and the browsers. If similar initiatives had been taken in proprietary environments, everything could have remained secretly hidden for a long time.