Monday, 5 March 2007

Module 1: Concepts (Extracts from NET11 text)

23. Human-computer interfaces

The Internet was originally designed to enable humans to instruct computers to act at a distance. The Internet still has this capacity; thus, we begin to see the emergence of a cultural sensibility in which the hard and fast distinctions between humans and computers as different kinds of ‘communicating devices’ break down. Telnet and similar functions on the Internet are different to programming and interacting with a computer on your desktop because you can’t physically see the computer at the other end of the connection.

This phenomenon has been exploited, for fun and research, by artificial intelligence programmers and language program developers. In one famous case, a ‘bot’ (robot) called Julia was developed (essentially a sophisticated program) that could, via IRC, fool people for at least a little while into believing it was a real human. It has been said that this example proves more about the lack of communication skills of humans than the abilities of computers.
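A sketch of how such a bot can hold up its end of a conversation: Julia's actual program is not reproduced here, but even a handful of invented pattern-and-response rules like the ones below (all placeholders for illustration) can sustain a brief, superficially plausible exchange.

```python
import random
import re

# Toy rules in the spirit of bots like Julia: match a keyword in the
# incoming message, answer from canned templates. All rules and replies
# here are invented for illustration, not drawn from the real Julia bot.
RULES = [
    (re.compile(r"\b(hi|hello|hey)\b", re.I),
     ["Hello! How are you today?", "Hi there."]),
    (re.compile(r"\bwhere\b", re.I),
     ["I was just in the other room.", "Why do you ask where?"]),
    (re.compile(r"\byou\b", re.I),
     ["We were talking about you, not me."]),
]
FALLBACKS = ["Interesting. Tell me more.", "I see. Go on."]

def reply(message: str) -> str:
    """Return a canned reply for the first matching rule, or a fallback."""
    for pattern, responses in RULES:
        if pattern.search(message):
            return random.choice(responses)
    return random.choice(FALLBACKS)
```

The fallback replies do much of the work: a vague, open-ended answer to anything unrecognised is exactly the trick that lets such programs pass, briefly, as human.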

The Internet lessens the recognition of difference between humans and computers because, at a distance, it often feels similar to communicate and act on the Internet regardless of whether one is speaking with a human or a machine.

Interacting with websites feels similarly impersonal or, more subtly, further indicates the extent to which humans readily accept the presence of machine-like ‘intelligence’ in their lives.

24. Client-server two-way interactions

File transfer protocol differs most clearly from hypertext transfer protocol (HTTP) – the basis of the World Wide Web – by the fact that it can be arranged very easily as a two-way exchange of information in any number of file formats. While many FTP operations only ever take place in one direction (from the server to the client), there are some publicly available FTP activities in which clients upload as well as download files. Sometimes this is used for public exchange of information (often for illegal activities, such as trading pirated software between people); more often it is so that a single person can, effectively, manage files on more than one computer (for example, their desktop machine and a server or public machine).

The end result is that, at its heart, the Internet remains a two-way street. Files can be sent as well as received. Moreover, while more often used for limited groups or individuals, FTP remains a public utility, enabling individuals to ‘publish’ material that can be taken, on demand, rather than (as in sending email attachments) being ‘pushed’ at the recipient (in the traditional broadcast model). We see here that FTP is an asynchronous technology: I publish a file and, days, weeks, years later, the recipient can go and collect that file. But it is also a client-driven technology: FTP does not deliver…it fetches.
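The two-way, client-driven exchange described above can be sketched with Python's standard ftplib. The host name, account details and file names below are placeholders, not a real server; the point is simply that the same connection both stores (pushes) and retrieves (fetches) files at the client's initiation.

```python
from ftplib import FTP

def mirror_file(host: str, user: str, password: str,
                local_path: str, remote_name: str) -> None:
    """Upload a local file to an FTP server, then fetch it straight back.

    host, user and password are placeholders -- substitute a real account.
    """
    with FTP(host) as ftp:
        ftp.login(user=user, passwd=password)
        # Upload: the client 'publishes' a file onto the server...
        with open(local_path, "rb") as f:
            ftp.storbinary(f"STOR {remote_name}", f)
        # ...and download: the client fetches it again, when it chooses.
        with open(local_path + ".copy", "wb") as f:
            ftp.retrbinary(f"RETR {remote_name}", f.write)
```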

File transfer protocol remains the best example of how the Internet enables files to be sent to and from clients, at their initiation, thus emphasising the local autonomy of the individual user, and the arrangement of ‘cyberspace’ into publicly accessible and changeable regions.

Two-way communication is the essence of the Internet and there are multiple options for doing it. Almost every form of communication (email, chat, etc.) enables users to send large ‘information’ chunks back and forth. The great advantage of the Internet is that, at base, all data is data and so any program which establishes a two-way connection between yourself and another computer/user will – if properly exploited – provide a range of communication options.
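The claim that any two-way connection can carry communication can be shown at its most basic: a toy echo service built directly on sockets, with the server and client on the same machine so the sketch is self-contained. Once bytes flow both ways, what they mean (chat, files, commands) is up to the programs at each end.

```python
import socket
import threading

def run_echo_server(port_holder: list) -> None:
    """Minimal server: accept one connection and echo back what it receives."""
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.bind(("127.0.0.1", 0))          # port 0: let the OS pick a free port
    srv.listen(1)
    port_holder.append(srv.getsockname()[1])
    conn, _ = srv.accept()
    data = conn.recv(1024)
    conn.sendall(data)                  # send the same bytes straight back
    conn.close()
    srv.close()

def echo_roundtrip(message: bytes) -> bytes:
    """Start the server in a thread, connect as a client, send and receive."""
    port_holder: list = []
    server = threading.Thread(target=run_echo_server, args=(port_holder,))
    server.start()
    while not port_holder:              # wait until the server is listening
        pass
    client = socket.create_connection(("127.0.0.1", port_holder[0]))
    client.sendall(message)
    received = client.recv(1024)
    client.close()
    server.join()
    return received
```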

25. Identity and location

The absolute, fundamental foundation of the Internet – one which must be maintained at all costs – is a system of identification and location, the creation of fixed, known ‘end points’ at either end of the complex routes taken by packets of data carrying all the information that makes up the Internet. Without this fixed system, which must be managed in such a way as to be both usable and expandable (a technical term for this is ‘scalability’), the Internet would not work.

At a technical level, the identity and location system that enables data packets to be routed to and from computers (usually via servers, thence routers, to other servers, thence to personal machines) can assist users in understanding why the Internet seems ‘slow’ or ‘fast’ at certain times and in certain conditions. It could, in some cases, assist users to choose between one or other ISP, or web server. This kind of knowledge, allied to a reading of the ‘names’ in the system, can help users to understand the ownership and control of the Internet and the way it functions as a business system. But more profoundly, the ‘system’ of Internet identity and location suggests a growing change in people’s understandings of themselves – marked, for example, by the difference between a ‘dynamic’ IP address that changes every time one is online and a ‘static’ address, available to people who run web servers or more expensive fixed, permanent Internet connections.
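The overlay of word-based names on numerical addresses can be queried directly. A minimal sketch, using Python's standard socket library to ask the name system for the number that routing actually uses:

```python
import socket

def locate(hostname: str) -> str:
    """Resolve a human-readable host name to the numerical (IPv4)
    address that the packet-routing system actually works with."""
    return socket.gethostbyname(hostname)
```

For example, `locate("localhost")` resolves the conventional name for one's own machine to its loopback address, `127.0.0.1`; the same call on a public host name returns whatever address its owner has registered, static or dynamic.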

Advanced Internet users understand the technical system of the Internet, principally its numerical addressing and word-based naming overlay and the way data passes between points in this system. They also understand that this knowledge can assist them in managing their Internet use, and in recognising new cultural developments around the creation of identities that exist in part in physical life and in part in the virtual world.

Since communication via email or chat or ICQ can occasionally involve unwanted attentions, or misdirected messages, or outright harassment, advanced users learn how to recover key information about location and identity from their communications programs to assist in preventing these activities.
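One sketch of recovering such location information: email ‘Received:’ headers record the servers a message passed through, and the addresses they contain can be extracted and then looked up in reverse. The regular expression below is a simplified illustration (real headers vary considerably), and the sample address in the usage note is from a reserved documentation range, not a real sender.

```python
import re
import socket

# 'Received:' lines commonly record the relaying server's IP address in
# square brackets; this simplified pattern captures a dotted-quad IPv4.
IP_IN_BRACKETS = re.compile(r"\[(\d{1,3}(?:\.\d{1,3}){3})\]")

def sender_ips(raw_headers: str) -> list:
    """Pull the IP addresses recorded in the Received: lines of a header."""
    ips = []
    for line in raw_headers.splitlines():
        if line.lower().startswith("received:"):
            ips.extend(IP_IN_BRACKETS.findall(line))
    return ips

def reverse_name(ip: str) -> str:
    """Ask DNS for the host name registered for an address, if any."""
    try:
        return socket.gethostbyaddr(ip)[0]
    except socket.herror:
        return "(no reverse record)"
```

For instance, on the header line `Received: from mail.example.com ([203.0.113.9]) by mx.local`, `sender_ips` would recover `203.0.113.9`, which `reverse_name` could then attempt to map back to an identifying host name.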

27. The persistence of history

When we consider certain applications, such as telnet, ftp and some of the more arcane ‘management’ tools such as the ability to finger, lookup and so on (so-called net tools), it may appear that, in the age of the World Wide Web, Internet telephony, AV conferencing online and so on, these are old-fashioned irrelevancies. However, they are not. These early applications continue to have direct value; moreover, advanced Internet users understand them because they provide a sense of history and context which can assist in developing new capabilities for Internet use. Furthermore, the ideas that underlie these technologies are critical and continue to govern the fundamentals of Internet use.

Advanced Internet users inquire into and analyse the kinds of applications available over the Internet, even if they do not regularly use them, so as to learn lessons about past developments and to anticipate potential new developments, based on the meaning of those applications.

Moreover, while new systems ‘appear’ different, they often use or include much older, traditional applications. For example, various identifier commands (ping, traceroute, etc.) can be used within IRC; telnet and ftp are tightly interlinked with http for web browsing.
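A concrete illustration of this persistence: the web's own protocol is plain text, of the kind that could be typed by hand over a telnet-style connection to a server's port 80. A sketch that composes such a request manually (HTTP/1.0 for simplicity), showing there is nothing in it a telnet user could not type:

```python
def build_http_request(host: str, path: str = "/") -> bytes:
    """Compose a raw HTTP/1.0-style request -- the same plain-text lines
    one could type by hand over a telnet connection to port 80."""
    return (f"GET {path} HTTP/1.0\r\n"
            f"Host: {host}\r\n"
            "Connection: close\r\n"
            "\r\n").encode("ascii")
```

Sent down any open socket to a web server, these bytes are a complete request; the browser's apparent sophistication sits on top of this older, simpler, text-based exchange.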
