Friday, February 6, 2009

Conceptual Research & Reflection Project

Index
Concept 1
Concept 2
Concept 3
Concept 4
Bibliography


8. The Invisibility of Difference

"The daily practice of electronic communication is shaped by over-familiarity with one's own computer system, and a tendency to assume that – as with much more established forms of communication – everyone is operating within compatible and similar systems. When in doubt, seek to communicate in ways that are readable and effective for all users, regardless of their particular systems." (Allen, n.d)

Today there is less variation in machine capability, software specifications, Internet connection speed and user experience than there was ten years ago. The Internet now offers users unprecedented opportunities for communication with those both near and far, yet the very process of communicating electronically can still be an isolated experience. Many users remain cocooned within their personal and private domain, familiar primarily with the equipment that surrounds them.

Because of this, many users give little thought to the technical capabilities of the person at the other end of the pipe. That person may have outdated equipment or software, use a completely different operating system, or lack the skills and understanding needed to complete the task. It is this ignorance, or failure to acknowledge such differences, that Allen calls the invisibility of difference (Allen, n.d.).

To experienced users who deal with electronic communication on a day-to-day basis, the divide that can exist between users is all too apparent, and they will take such differences into account when communicating with others. Novice users, on the other hand, are so bewildered by the technology and the newness of the experience that they remain in the questioning phase of their journey and will be guided by those around them. There is, however, also a large proportion of what I shall call intermediate users, who reach a point where they are comfortable with what they do online. They forget the possible constraints on other users and fall into the mindset described by Allen.

It is these intermediate users who can amplify the poor experiences that stem from a failure to understand the differences between users. An example of this type of user and their impact can be found on the popular social networking site MySpace. A large proportion of MySpace users have graphics-heavy, music-laden personal pages that those with a slow Internet connection or a less powerful machine will struggle to load. This is not limited to MySpace alone; many other social networking sites and amateur websites suffer similar problems.

While the Web has always been developed to be resolution independent, much research and discussion has gone into this invisibility of difference (Allen, n.d.), and it has led to many changes on the Internet. The more recent of these changes can be seen with the advent of what can be described as Web 2.0 design, particularly the narrow centred column, initially optimised for a width of 800px and now for 1024px to take account of the average user's screen resolution. The changes have not been limited to screen resolution: many website developers are aware of differences in access speed and have moved away from graphics-heavy layouts. By minimising graphics, developers allow sites to load faster and lessen the impact on those with slower connections.

It is all these changes, along with the continuing standardisation of browser technology, that are helping the web move towards a smoother and more consistent user experience.

Site 1: Web Design From Scratch http://www.webdesignfromscratch.com/
Written in an easy-to-follow, casual manner, Web Design from Scratch is an excellent resource for web designers of all levels of experience. Ben Hunt draws on his years of web design experience to take the reader through a range of techniques and concepts that help designers build sites that are both eye-catching and economical. His approach to website construction directly takes the "invisibility of difference" (Allen, n.d.) into account and encourages designers to build their sites to accommodate a varying range of users.

Site 2: MySpace http://www.myspace.com/
MySpace is a popular social networking site for youths and adults alike. One of its popular features is the ability for users to customise their personal pages as they see fit. This feature has allowed even the most basic of Internet users to become web designers, ultimately leaving a large proportion of pages bloated with video, audio or graphics. It highlights how one user may not consider how their machine capability or connection speed differs from that of another, ultimately resulting in a poor experience for others.

Return to Index


26. Privacy and Security
"The Internet is a profoundly ‘open’ system and advanced Internet users are cautious about either accepting or sending material from and to unknown sources and are careful in releasing information about themselves in any form. Conceptually, the Internet challenges us to take greater responsibility for the protection of privacy and security than perhaps we are used to when dealing with the media." (Allen, n.d)

With the advent of social media on the Internet, the concepts of privacy and security have been turned on their heads. People now seem more willing than ever before to give up snippets of personal information to online social networking and media sites.

From network building, to blogging, to tweeting, to sharing video and photos, users can now upload their lives to the Internet in minute detail. Social media encourages users to share all of who they are: opinions and thoughts, school and work history, relationship status, likes and dislikes.

Sharing all this information online offers unprecedented convenience of communication. In this new age users can look at friends' and family's photos instantly and keep track of what they are up to on a near-hourly basis, no matter where either party is in the world. Yet it is this very convenience and openness that presents many potential issues and threats to the online community.

This plethora of information can allow one person to profile another, building up a dossier detailed enough to impersonate them online. A recent example is the experiment conducted by Shawn Moyer and Nathan Hamiel, in which the pair posed as security expert Marcus Ranum on the popular networking site LinkedIn.

Throughout the experiment, Moyer and Hamiel were able to fool a number of Ranum's known associates into believing the profile was genuine, and so build an increasingly elaborate profile as the experiment progressed. The experiment highlighted that many users of social media and networking sites are prepared to accept online profiles at face value, with very little proof (Moyer & Hamiel, 2008).

Another side of Moyer and Hamiel's experiment highlighted a further concern: the features of social networking sites that allow offsite content. These range from allowing HTML to be inserted into comments to the customisation of profiles with cut-and-paste code from potentially untrusted sources. For the uninitiated or security-unconscious, this can result in malicious code being inserted into their profiles (Moyer & Hamiel, 2008).
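To illustrate the kind of safeguard this calls for, the short Python sketch below shows a minimal allow-list sanitiser that a site could, in principle, run over pasted profile markup before displaying it. The allowed tag list is entirely hypothetical, and this is not how MySpace or any particular site actually filters content.

```python
from html.parser import HTMLParser

# Hypothetical allow-list; a real site would tune this to its own needs.
ALLOWED_TAGS = {"a", "b", "i", "em", "strong", "p", "br", "img"}

class ProfileSanitiser(HTMLParser):
    """Rebuilds pasted profile markup, dropping any tag not on the allow-list
    and the contents of disallowed containers such as <script>."""

    def __init__(self):
        super().__init__(convert_charrefs=True)
        self.clean = []
        self._skip_depth = 0  # > 0 while inside a disallowed element

    def handle_starttag(self, tag, attrs):
        if tag not in ALLOWED_TAGS:
            self._skip_depth += 1
            return
        if self._skip_depth == 0:
            # Drop event-handler attributes (onclick, onload, ...) that can carry script.
            safe = [(k, v) for k, v in attrs if v is not None and not k.startswith("on")]
            attr_text = "".join(f' {k}="{v}"' for k, v in safe)
            self.clean.append(f"<{tag}{attr_text}>")

    def handle_endtag(self, tag):
        if tag not in ALLOWED_TAGS:
            if self._skip_depth > 0:
                self._skip_depth -= 1
            return
        if self._skip_depth == 0:
            self.clean.append(f"</{tag}>")

    def handle_data(self, data):
        if self._skip_depth == 0:
            self.clean.append(data)

def sanitise(markup: str) -> str:
    parser = ProfileSanitiser()
    parser.feed(markup)
    return "".join(parser.clean)

# The pasted <script> block is removed entirely; the harmless markup survives.
print(sanitise('<b>My page</b><script>stealPasswords()</script>'))
```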

These findings support the current view that security and privacy controls online, particularly within social media, need an overhaul. Many discussions have already begun, with a focus on further enhancing the ways users can manage their profile data.

Sites such as Facebook already allow limited profiles, whereby a user can control what a friend sees on their profile page or news feed. Many believe this could go even further, allowing the creation of multiple online personas to control the material presented to different networks. As reported by Sonia Arrison, Jim Dempsey made the point at the recent discussion Privacy 2009: The year ahead that people present themselves differently in different situations, such as at work and at home, so why should it be any different online? (Arrison, 2009)

While no single organisation seems to have the answer to this most complex of problems yet, end users can at least take this away now: as Internet users in a growing online world of social media and personal information sharing, we have a greater need than ever for awareness of online security and for self-management of personal information and privacy.

Site 1: The Office of the Privacy Commissioner http://www.privacy.gov.au/internet/
While there is a plethora of online privacy guides on the World Wide Web, for Australians with little web experience the Australian government's privacy website is an excellent starting point for developing a basic understanding of privacy issues. The site lists a number of known privacy and security concerns for the Internet novice, as well as the tools users have at their disposal to protect their online privacy and security. It is straightforward, giving clear definitions of the problems while offering practical advice to help users protect themselves online.

Site 2: Electronic Frontier Foundation http://www.eff.org/issues/privacy
The Electronic Frontier Foundation is one of the preeminent online organisations dealing with, amongst other things, online privacy and security, and although its focus is the United States, the site still holds a lot of relevant information for netizens from all parts of the world. In particular, the EFF's privacy section is an excellent resource for those wishing to delve deeper into the very complex world of online privacy and security. It offers up-to-date news and information on current issues, including rulings from various court cases and changes to privacy laws, while providing opinion pieces and whitepapers on research conducted within the field.

Return to Index


31. Hypertext: links or structure?
"While the WWW depends on hypertext, most of it uses hypertext merely for navigation (as in the first kind). Individual documents and even sites generally look much like linear, paper-printed materials. But, the whole of the web is rather more like the loose, unstructured ‘hypertext’ of the second kind. This suggests that hypertext is about both linking in the traditional way, but more effectively; and about structuring in a completely new way, based on this technology." (Allen, n.d)

Hypertext is defined by the Merriam-Webster unabridged dictionary as a database format in which information related to that being displayed can be accessed by clicking on highlighted text (Merriam-Webster, 2002). Although Ted Nelson coined the term in 1965, the concept of Hypertext extends back to 1945 and Vannevar Bush's Memex: a device comprising photographic, electrical and mechanical elements that would allow the user to make links between documents stored on microfilm (Bush, 1945).

From that initial concept, through Tim Berners-Lee's 1989 proposal for what became the World Wide Web and its Hypertext Markup Language, the Hypertext concept has developed into the Hypertext reality. It can now be argued that much of a web page's structure comes from its Hyperlinks and how they are organised.

In fact, over time users have come to rely on a very set structure of Hyperlinks to help them navigate websites and individual pages. Usability expert Jakob Nielsen has highlighted this through his ongoing studies, which show how users have grown accustomed to this basic website structure and, as a result, have expectations of where they will find particular elements, from navigation running across the top of the page to 'contact us' and 'about us' links in the footer.

As it is this navigational aspect of Hypertext that users are most familiar and comfortable with, it would seem that the inherent purpose of a Hypertext link, or Hyperlink, is to provide linear navigation around a web page or website. It could be said, though, that this linear structuring fails to take advantage of what Hypertext can truly offer.

The real ingenuity of Hypertext comes from linking pages and referencing material on offsite pages. Hypertext allows the author to create an experience that abandons the comfortable linear structure and lets the reader's discovery of knowledge and ideas be more organic. It is this organic referencing that gives Hypertext its real power. In recent years the practice has flourished, particularly through blogs linking to sites that further highlight concepts or sites of interest in the context of their own material.

This concept of linking still has its limitations, as pointed out by Ted Nelson in his Google TechTalk presentation, Transclusion: Fixing Electronic Literature. Nelson argues that Hypertext is currently a one-way path: links only go outward, so when browsing a page it is impossible to see the pages that link to it (Nelson, 2007).

To some extent this limitation is overcome by a ping or trackback, whereby the linking site sends an acknowledgement to the site it links to. The practice is still mostly limited to blogs, and it still does not offer the user any visible back-link when browsing the linked-to site itself. It can be argued, though, that it is a further step towards the greater underlying Hypertext structure of the Internet.
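For the technically curious, a trackback ping is, at the protocol level, simply an HTTP POST of a few form-encoded fields (title, excerpt, url and blog_name) to the trackback URL advertised by the entry being linked to, which replies with a small XML document. The sketch below, using only Python's standard library, shows roughly what a blog engine sends; the URLs and field values are placeholders, not a real endpoint.

```python
import urllib.parse
import urllib.request

def send_trackback(trackback_url: str, post_url: str, title: str,
                   excerpt: str, blog_name: str) -> str:
    """Send a trackback ping announcing that `post_url` links to the target entry.

    Returns the raw XML response; per the trackback specification the body
    contains <error>0</error> on success.
    """
    fields = {
        "url": post_url,        # the page on our own blog that contains the link
        "title": title,
        "excerpt": excerpt,
        "blog_name": blog_name,
    }
    data = urllib.parse.urlencode(fields).encode("utf-8")
    request = urllib.request.Request(
        trackback_url,
        data=data,
        headers={"Content-Type": "application/x-www-form-urlencoded; charset=utf-8"},
    )
    with urllib.request.urlopen(request) as response:
        return response.read().decode("utf-8", errors="replace")

# Hypothetical example values; a real ping would use the trackback URL
# advertised by the entry being linked to.
# print(send_trackback("http://example.com/trackback/42",
#                      "http://myblog.example/2009/02/hypertext",
#                      "Hypertext: links or structure?",
#                      "A reflection on one-way links...",
#                      "My Blog"))
```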

Site 1: Use It http://www.useit.com/alertbox/20050103.html
This article was written by Jakob Nielsen in 2005 and is part of his excellent site Useit.com. It presents a number of interesting concepts on the future direction of Hypertext that still stand today. Of particular note is "Explicit Structure" (Nielsen, 2005): the incorporation of buttons that allow a user to go straight to sections such as 'about us', 'contact us' or the like. It is concepts like this, along with "Fat Links" (Nielsen, 2005) and "Physical Hypertext" (Nielsen, 2005), that have the potential to drive the future of Hypertext and the World Wide Web.

Site 2: Hypertext: The convergence of contemporary critical theory and technology http://www.cyberartsweb.org/cpace/ht/jhup/decenter.html
While not written directly about the World Wide Web, this section from chapter one of Landow's 1992 publication on Hypertext rings true for today's Web. Landow's description of Hypertext as an infinitely de-centerable and re-centerable system that allows the user to make their own interests the de facto organising principle (Landow, 1992) highlights the true non-linear nature and power of Hypertext. The concept fits the idea of the World Wide Web and its constant linking to external websites for annotated information or deeper explanation, in particular where a user can choose the path they follow from site to site.

Return to Index


33. Information and Attention
"In the era of the ‘attention economy’, readers and users of Internet information must be carefully craft, in their own minds, the kind of metadata which will – almost instinctively – ‘fit’ with the metadata of the information sources they want, so that – in the few brief moments of initial exchange, when a seeker of information encounters information being sought, rapid, effective judgments are made that ‘pay off’ in terms of further reading, accessing and saving." (Allen, n.d)

The Internet is filled with a near-infinite amount of information, yet we have only a finite amount of time and attention to give it. We must therefore learn to find the information we seek both quickly and efficiently. As we learn to use the Internet, we tend to learn how to sift through the metadata of websites, presented to us either through our ordinary browsing or by the various tools available to us, such as search engines.

Metadata is defined by the Merriam-Webster dictionary as "data that provides information about other data" (Merriam-Webster, 2002). This metadata can be obtained in various ways and comes in many forms. Probably the most obvious on the Internet is the metadata presented via a search engine.

Through a search engine a user can see such things as page titles, descriptions and URLs. Even before looking at a page, they are deciding whether the site holds what they are searching for. Users sum up this metadata, seemingly arbitrarily, to judge a page's worth before clicking through to it.

Of course the process is not arbitrary; an experienced user will have a clear idea of what they are searching for. Gleaning the data provided by the search engine allows users to build a mental picture of the site in question and determine whether further investigation is worthwhile.
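Much of the metadata a search engine displays in its result snippets comes straight from a page's own markup, chiefly the title element and the description meta tag. As a minimal sketch of how those two fields can be pulled from a page, the following Python uses only the standard library; the URL is a placeholder, and a real search engine naturally does far more than this.

```python
import urllib.request
from html.parser import HTMLParser

class MetadataExtractor(HTMLParser):
    """Collects the <title> text and the content of <meta name="description">."""

    def __init__(self):
        super().__init__(convert_charrefs=True)
        self.title = ""
        self.description = ""
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self._in_title = True
        elif tag == "meta" and (attrs.get("name") or "").lower() == "description":
            self.description = attrs.get("content") or ""

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data

def page_metadata(url: str):
    """Return (title, description) for a page, much as a search snippet shows them."""
    with urllib.request.urlopen(url) as response:
        html = response.read().decode("utf-8", errors="replace")
    extractor = MetadataExtractor()
    extractor.feed(html)
    return extractor.title.strip(), extractor.description.strip()

# Placeholder URL, for illustration only.
# print(page_metadata("http://example.com/"))
```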

On deciding to explore the offered web page further, the user is again presented with metadata that will determine whether they continue reading. This time it may not be as obvious as on the search engine's results page, yet it is still there: the layout and design of the page, the introduction to the page or article, the pictures, and any text that stands out as the user scans the page.

Even with the tools available to us today, managing this metadata is becoming more and more difficult. The amount of data available to us is ever increasing: Google engineers Jesse Alpert and Nissan Hajaj recently stated on the official Google blog that Google's systems had found one trillion unique web addresses, and counting (Alpert & Hajaj, 2008). On top of this, not every site owner or developer puts meaningful data into elements such as titles or descriptions.

Web developers and search providers need to look to new methods of managing this vast amount of information and making it more relevant. Google recently introduced SearchWiki, an annotation and ranking system for signed-in users of its popular search engine. Users can make private comments and notes on the sites that come up in their search results while rating and re-ranking those results. These changes do not affect the results other users see, but they can provide useful reference material for the user's own later searches.

I feel Google could take this a step further and include a tagging feature that allows users to tag search results for later use. End users could then choose to search with user tags switched on or off, allowing a further refinement of searching through the metadata available on the Internet.
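As a purely hypothetical sketch of that idea (it reflects no real Google feature or API), the Python below keeps a per-user mapping of tags to result URLs and then reorders an ordinary result list so that tagged items float to the top when user tags are switched on.

```python
from collections import defaultdict

class TaggedSearch:
    """Hypothetical per-user tag store layered over ordinary search results."""

    def __init__(self):
        # tag -> set of URLs the user has filed under that tag
        self._tags = defaultdict(set)

    def tag(self, url: str, *tags: str) -> None:
        for t in tags:
            self._tags[t.lower()].add(url)

    def tagged_urls(self, tag: str) -> set:
        return self._tags.get(tag.lower(), set())

    def search(self, results: list, tag: str = "", use_tags: bool = True) -> list:
        """Reorder `results` (a list of URLs) so URLs carrying `tag` come first.

        With use_tags=False the results pass through untouched, mimicking a
        'user tags off' search.
        """
        if not use_tags or not tag:
            return results
        tagged = self.tagged_urls(tag)
        return [u for u in results if u in tagged] + [u for u in results if u not in tagged]

# Example: tag two results from an earlier search, then refine a later one.
store = TaggedSearch()
store.tag("http://www.useit.com/alertbox/20050103.html", "hypertext", "usability")
store.tag("http://www.cyberartsweb.org/cpace/ht/jhup/decenter.html", "hypertext")

later_results = [
    "http://example.com/unrelated",
    "http://www.cyberartsweb.org/cpace/ht/jhup/decenter.html",
    "http://www.useit.com/alertbox/20050103.html",
]
print(store.search(later_results, tag="hypertext"))
```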

Site 1: The Official Google Blog http://googleblog.blogspot.com/2008/11/searchwiki-make-search-your-own.html
This blog post, written by Google product manager Cedric Dupont and software engineer Corin Anderson, describes the features of Google's new annotated search function, SearchWiki. The post also contains an embedded video that provides a good tutorial showing how the new tool is used within a search. Personally, I find this social feature for search an impressive step forward, allowing the user to comment on, re-rank and hide search results. As the amount of metadata in the online world increases, users will need more powerful and innovative ways of sifting through it.

Site 2: Delicious http://delicious.com/
Delicious.com is a site that specialises in folksonomy, or the social tagging of bookmarks. Many people use Delicious for online bookmarking and for sharing interesting links with their online networks. I feel, though, that Delicious can offer much more than a repository for the fruits of one's Internet browsing. The very fact that thousands of users track sites and pages daily, tagging and annotating them with further information, makes Delicious a powerful tool for searching vast amounts of data that has already been narrowed down for the user.

Return to Index


Bibliography
Alpert, Jesse & Hajaj, Nissan. (2008). Official Google blog: We knew the web was big. Retrieved January 14, 2009, from http://googleblog.blogspot.com/2008/07/we-knew-web-was-big.html

Arisson, Sonia. (2009). Will 2009 be the year of multiple digital identities? Tech news world. Encino: ECT News Network. Retrieved January 18, 2009, from http://www.technewsworld.com/story/Will-2009-Be-the-Year-of-Multiple-Digital-Identities-65768.html

Australia. The Office of the Privacy Commissioner. (n.d) Information Technology and Internet Issues. Retrieved January 18, 2009, from http://www.privacy.gov.au/internet/

Bush, Vannevar. (1945.) As we may think. The Atlantic. Retrieved January 15, 2009, from http://www.theatlantic.com/doc/194507/bush

Delicious (n.d) Retrieved January 14, 2009, from http://delicious.com/

Dupont, Cedric., & Anderson, Corin. (2008). The official Google blog: SeachWiki: Make search your own. Mountain View: Google. Retrieved January 14, 2009, from http://googleblog.blogspot.com/2008/11/searchwiki-make-search-your-own.html

Electronic Frontier Foundation. Privacy. (n.d) Retrieved January 18, 2009, from http://www.eff.org/issues/privacy

Hunt, Ben. (2003). Web Design from Scratch. Retrieved January 22, 2009, from http://www.webdesignfromscratch.com/

Hypertext. (2002). In Webster's Third New International Dictionary, Unabridged. Springfield: Merriam-Webster. Retrieved January 15, 2009 from http://unabridged.merriam-webster.com/cgi-bin/unabridged?va=hypertext&x=0&y=0

Landow, George P. (1992). Hypertext: The convergence of contemporary critical theory and technology (11-13). Baltimore : Johns Hopkins University Press. Retrieved January 17, 2009, from http://www.cyberartsweb.org/cpace/ht/jhup/decenter.html

Metadata. (2002). In Webster's Third New International Dictionary, Unabridged. Springfield: Merriam-Webster. Retrieved January 14, 2009 from http://unabridged.merriam-webster.com

Moyer, Shawn (Speaker)., Hamiel, Nathan (Speaker)., & Peters, Sara (Presenter). (2008). TechWebTV: Black Hat 2008; Satan is on my friends list [Vodcast]. Las Vegas: Black Hat. Retrieved January 18, 2009, from http://au.youtube.com/watch?v=2lGKzHYBXtQ

MySpace (2003) Retrieved January 22, 2009, from http://www.myspace.com/

Nelson, Ted (Speaker). (2007). Google TechTalk: Transclusion; Fixing electronic literature [Vodcast]. Mountain View: Google. Retrieved January 15, 2009, from http://au.youtube.com/watch?v=Q9kAW8qeays&feature=channel_page

Nielsen, Jakob. (2005). Reviving Advanced Hypertext. Retrieved January 16, 2009, from http://www.useit.com/alertbox/20050103.html

Return to Index