Tuesday, May 13, 2008

The Truth About Wikipedia

This is a 48-minute video that discusses the power of Wikipedia, and how so many users have come to rely on a website that employs only five people (as mentioned in the previous blog) and is funded purely by donations and subsidies. The video includes interviews with the founders of Wikipedia and discusses how Wikipedia's success is due to the fact that it embraced Web 2.0.

Wikipedia Vs Encyclopedia Britannica

An expert-driven study of Wikipedia claimed that Wikipedia was just as accurate as Encyclopedia Britannica.

The study, which was published in the journal Nature, named Wikipedia as a good source of accurate information. This is despite the common belief that, because its information is not sourced from credible authors, it lacks general accountability and thus has no place in the world of serious information gathering (Terdiman, 2005). Wikipedia, a free, open-access encyclopedia, lets anyone create and edit information, with accuracy maintained by a community made up of thousands of volunteer editors. It is this passionate community of contributors from all around the world who wield control over Wikipedia, completing all the maintenance tasks and often rectifying incorrect information within minutes of its being submitted.
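The "rectified within minutes" claim rests on a simple mechanism: every edit is stored as a revision, so a bad edit can be reverted rather than lost. A toy sketch in Python (purely illustrative; this is not how MediaWiki is actually implemented) of that edit-and-revert model:

```python
# Toy model of a wiki page: anyone may edit, every version is kept,
# and reverting is just another edit that restores an earlier revision.
class WikiPage:
    def __init__(self, text=""):
        self.revisions = [text]          # full history of the page

    @property
    def current(self):
        return self.revisions[-1]        # the version readers see

    def edit(self, new_text):
        self.revisions.append(new_text)  # anyone may submit an edit

    def revert(self):
        # restore the previous revision, keeping the bad one in the history
        if len(self.revisions) > 1:
            self.revisions.append(self.revisions[-2])

page = WikiPage("Wikipedia launched in 2001.")
page.edit("Wikipedia launched in 1901.")   # vandalism slips in...
page.revert()                              # ...and a volunteer undoes it
print(page.current)                        # Wikipedia launched in 2001.
```

Because nothing is ever deleted, the vandalised version stays in the history, which is what lets volunteer editors patrol changes and undo them quickly.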

I found this study by Nature particularly interesting and somewhat ironic. This is primarily because at university, when producing an academic text, referencing Wikipedia is completely prohibited. As one of my law lecturers put it, “It’s completely controlled by just a couple of guys in California, how could we possibly trust it?” This is a very common point of view, owing to the fact that Wikipedia’s content can be edited by anybody with access to the internet, while Encyclopedia Britannica’s content is completely in the hands of nineteen full-time “professional” editors (Britannica, 2008). These two different styles of forming content both have their strengths and their weaknesses.

Wikipedia is not the free-for-all that many paint it to be. Its articles are primarily edited by a group of a few hundred volunteers, thus representing a traditional organisation much like the one that maintains Britannica, except on a larger scale (Wales, 2007). This means that the information is ultimately overseen by a select group and subject to the inadequacies of its members. An example is Kim Dabelstein Petersen, the person responsible for editing Wikipedia's information on global warming, who was later revealed to be deleting any content that alleged that climate change was not a proven theory (Marohasy, 2008). In order to ensure that her content retained her points of view, Petersen would patrol her articles and revert any corrections (Solomon, 2008). This goes to show that the site renowned for its information freedom is subject to the same sort of biases from its editors as any other encyclopedia.

Encyclopedia Britannica submitted a 20-page paper to Nature disputing the findings of the comparative study, claiming that “Almost everything about the journal’s investigation, from the criteria for identifying inaccuracies to the discrepancy between the article text and its headline, was wrong and misleading.” Britannica argued that the study (which had gone on to be widely publicised in the media) was based on such poorly conducted research that all of its findings should be deemed invalid. In response, Wikipedia began a page of “Errors in the Encyclopedia Britannica that have been corrected in Wikipedia,” which attempts to demonstrate the advantages of an editorial process where anyone can edit at any time.

This rivalry between information outlets will ensure that encyclopedias will endeavour to provide accurate information, by keeping each other in check and ultimately providing high-quality material.

Saturday, May 10, 2008

The Power of the Online Community

A clip of trends analyst Dr Patrick Dixon, explaining the importance of online communities. He believes that online communities such as TripAdvisor and Wikipedia represent "The death of branding. The death of aggressive marketing. And the birth of revelation. The birth of information."

It is the end of traditional advertising as we know it.

Friday, May 9, 2008

The Inherent Problem with Second Life

There is no better example of an online community than Second Life. The website describes itself as a 3D virtual world in which people can create characters (referred to as avatars) and interact much as we do in real life. They can create businesses, buy and sell land, even create music and open museums; all the while using a currency (Linden Dollars) which can be converted into legal tender. The virtual world’s popularity is extraordinary, with over 13 million accounts currently registered. This number is rapidly growing due to the community’s ability to expand and evolve through the use of free and open source software. However, the virtual reality website is beginning to have some serious real-world problems, including child pornography distribution, tax-free commerce and illegal online gambling. These problems arise because Second Life's creators don’t want to alienate hard-core users by placing limits on the community’s uses. Furthermore, it is becoming increasingly difficult for the creators to retain control over all of the community’s features, with many of its users creating and altering the program’s applications to suit their needs.

In May of last year, German investigations revealed trading groups and members were utilising Second Life as a platform to exchange child pornography photos (BBC News, 2007). The investigator, whilst undercover as an avatar in the game, was invited to a virtual meeting within Second Life in which child abuse photos were distributed and child pornography opportunities were discussed. The investigation also uncovered “age play” groups in which avatars could sexually abuse virtual children. Second Life’s popularity is due to its “open-mindedness,” allowing people to interact more freely than they would in the real world. However, it is this leniency towards Second Life’s uses that allows atrocities such as this to happen. By allowing people to freely upload any material within private transactions, Second Life has made it easy for people to trade illicit material, including depictions of sexual abuse and bestiality (Terdiman, 2006). Because one of the site’s biggest drawcards is animated sexual interaction between users’ avatars, the site cannot restrict virtual child sex abuse without also restricting that drawcard and risking losing users.

Ultimately however, Second Life can be a useful tool, and like any other online community it has its benefits. It provides us with the opportunity to pursue business enterprises, share ideas, find people with like interests and explore our own humanity through less-than-human renditions of ourselves (Taran, 2007). Furthermore, just like any online community, it allows people to transcend geopolitical limitations and interact with those who interest them.

The problems associated with Second Life highlight many of the issues that can arise in any online community. Virtual communities allow freedom and accessibility never previously possible on other media platforms. However, with this freedom can come misuse. Ultimately, providing people with the tools necessary to interact freely will always mean that there will be people who exploit these tools to the detriment of society.

Wednesday, May 7, 2008

How do Communities Evaluate Quality?

The quality of any citizen journalism project reflects the contributions of those who choose to participate, and such projects can be havens for triviality or unreliable information (Educause Learning Initiative). Many users are inclined to trust material they find online, particularly if it is called "news." Consequently, citizen journalism has the potential to implicitly validate content that might be inaccurate, offensive, or lacking in credibility. Consumers of citizen journalism should therefore read the news with a skeptical eye, however well intentioned a citizen journalist might be.

Is Citizen Journalism Just Journalism?

Citizen journalism, as further described in the video in my previous blog, can be summed up as simply the act of citizens playing an active role in the process of collecting, reporting, analyzing and disseminating news and information (Wikipedia). The intent of this participation is to provide the independent, reliable, accurate, wide-ranging and relevant information that a democracy requires.

James Farmer wrote in his blog for The Age, that he believed that Citizen Journalism is simply journalism. He said, "How difficult is it to collect, report, analyse and disseminate? Easy huh! Just aggregate information through something like RSS, or go to the game or event or be on the spot when something happens. Then you've just got to "report" on that, which also couldn't be simpler, could it not? Just write some copy or select the video or photograph and package it, not a problem. Analysis? Even easier: just a case of saying what you think about stuff. And the dissemination part? Easy peasy: simply develop and design a website, or join a website where people are doing similar to the above and that's that. But guess what? If you've just completed any or all of the above, you're no longer a citizen, you're a journalist. You're investigative and on the scene, putting together balanced, objective articles on the events or selecting media to illustrate them, applying your analytic skills and then distributing the final product through what is commonly called a "news and information site" (previously known as a newspaper)."
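Farmer's "just aggregate information through something like RSS" step is less mysterious than it sounds: an RSS feed is just an XML document listing items with titles and links. A minimal sketch (the feed content here is invented for illustration) of pulling headlines out of an RSS 2.0 feed using only Python's standard library:

```python
# Minimal RSS 2.0 aggregation: parse a feed and list its headlines.
import xml.etree.ElementTree as ET

# A made-up feed standing in for one fetched from a news site.
SAMPLE_FEED = """<?xml version="1.0"?>
<rss version="2.0">
  <channel>
    <title>Example News</title>
    <item><title>Local team wins final</title><link>http://example.com/1</link></item>
    <item><title>Council approves new park</title><link>http://example.com/2</link></item>
  </channel>
</rss>"""

def extract_items(feed_xml):
    """Return (title, link) pairs for every <item> in an RSS 2.0 feed."""
    root = ET.fromstring(feed_xml)
    return [(item.findtext("title"), item.findtext("link"))
            for item in root.iter("item")]

for title, link in extract_items(SAMPLE_FEED):
    print(f"{title} -> {link}")
```

Of course, as Farmer's sarcasm implies, collecting the items is the easy part; the reporting and analysis that follow are where the journalism actually happens.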

Farmer bases this argument on the belief that many of the major "citizen journalism" websites employ journalists to edit the masses of information that are submitted. Examples of sites that Farmer uses include Newsvine, Digg and OhMyNews. However, he concludes that there is a revolution, that this revolution is positive, but that it is simply not citizen journalism. "Let's have sites that are built on citizen media and far greater and more worthwhile interaction between readers, journalists and editors. It's a riveting and powerful development in the world of online news, information and entertainment, but it's not citizen journalism and nor will it ever be."

Farmer's arguments are not particularly well-founded, and there are many opposing views available. But what his arguments do manage to do is bring forth another, more important question: what is a journalist?

What is Citizen Journalism?

This video provides an in-depth definition of Citizen Journalism. The video explains its importance, and the problems it has created for long-established media sources, such as newspapers.

Tuesday, May 6, 2008

The Importance of Open Source Software

I have found articles that go beyond simply explaining the elements of Open Source Software, and go one step further by discussing its importance.

David A. Wheeler's article "Why Open Source Software?" provides in-depth quantitative data and arguments as to why Open Source Software is the superior choice to proprietary software. The paper provides a balanced argument by examining market share, reliability, performance, scalability, security and total cost of ownership.

Arguments made by Wheeler include the following:
  • The most popular web servers have always been Open Source Software (OSS)
  • Based on a software program's capacity to resist crashing and freeze-ups, OSS has been found to be more reliable and less vulnerable to problems.
  • OSS has been found to be less likely to contain defects than proprietary software.
  • In one performance comparison, Linux was found to be faster than and to outperform Microsoft's offerings. Samba is capable of handling four times as many clients as Microsoft's equivalent. Linux has superior drive configurations to Microsoft.
  • 60% of the world's fastest supercomputers use Linux.
  • OSS is cheaper to acquire, has lower maintenance costs, does not impose licence management costs, has smaller hardware costs, etc.

Ultimately, Wheeler makes a convincing argument for the superior nature of Open Source Software. He does this by drawing conclusions based on numerous data studies, creating an essay with its foundations lying in fact.

Thursday, May 1, 2008

Understanding Web 2.0

This video was utilised in a lecture to provide a visual explanation for the evolving nature of the internet. It also relates to the topic of how "Web 2.0 is different to Web 1.0."

Wednesday, April 30, 2008

How is open source work (as an example of community produsage) different from commercial production?

Open Source

A basic definition of "open source" (also sometimes referred to as "free software") is the development method used for many pieces of software, where the source code is freely available for anyone to work on, modify, learn from, or use in other projects (ArmLinux, 2008).
The idea behind open source is that any programmer can utilise and build on the software for free. One of the main benefits is that many developers use, modify and build on the source code so the software evolves very quickly (Butterfly Internet, 2008). Open source is a development method for software that harnesses the power of distributed peer review and transparency of process. The promise of open source is better quality, higher reliability, more flexibility, lower cost, and an end to predatory vendor lock-in.

Some consider Open Source as a set of principles and practices that promote access to the design and production of goods and knowledge. Some consider open source as one of various possible design approaches, while others consider it a critical strategic element of their operations. Before open source became widely adopted, developers and producers used a variety of phrases to describe the concept; the term open source gained popularity with the rise of the Internet, which provided access to diverse production models, communication paths, and interactive communities (Wikipedia).

The Open Source Definition:
According to the Open Source Initiative, the distribution terms of open-source software must comply with the following criteria:
  1. Free Redistribution
  2. Source Code
  3. Derived Works
  4. Integrity of the Author's Source Code
  5. No Discrimination Against Persons or Groups
  6. No Discrimination Against Fields of Endeavor
  7. Distribution of License
  8. License Must Not Be Specific to a Product
  9. License Must Not Restrict Other Software
  10. License Must Be Technology-Neutral

Commercial/Proprietary Software

Proprietary software is in some ways the complete opposite of Open Source. It is basically any computer software with restrictions on use or private modification, or with restrictions judged to be excessive on copying or publishing of modified or unmodified versions. The term proprietary software is thus, generally speaking, the antonym of free software. These restrictions are placed on it by one of its proprietors. Other terms used to describe this concept include "closed-source software" and "non-free software" (Wikipedia).

These restrictions are enforced by either legal or technical means, or both. The most common form of technical restriction is by releasing programs that are only computer-readable (for example, in binary format), and withholding the human-readable source code. Means of legal enforcement can involve copyright (with a restrictive software licence) and patents. The source code of such programs is usually regarded as a trade secret by the owner. Access to source code by third parties commonly requires the party to sign a non-disclosure agreement.

Examples of Proprietary Software:
Well-known examples include Microsoft Windows, Microsoft Office and Adobe Photoshop.

Wednesday, April 16, 2008

How is Web 2.0 different to Web 1.0?

On the most basic level:
Web 1.0 includes most websites in the period between 1994 and 2004.
Web 2.0 is the state of the World Wide Web following Web 1.0.

However, these differences go far deeper. According to one writer the ten key differences between Web 1.0 and Web 2.0 are:
1. Open Standards Base: Ensures service connectivity is reliable.
2. Ubiquitous Broadband: The infrastructure is now available to support Web 2.0 models.
3. Less investment required: Companies can get far without a massive investment, meaning companies can quickly be incubated to spread the risk.
4. Better Browsers: New format support, RSS etc. enriches the user experience.
5. Powerful development environments: AJAX is young but powerful and holds the promise of being easier to use compared to J2EE.
6. Device convergence: The ability to access the web from a multitude of devices means on-demand services are more functional for real everyday use.
7. More Innovation: The de-skilling of the technological requirements means more people get involved in trying to create, often from a more creative user-base.
8. Change in Use: The focus of the web and Web 2.0 is firmly on usefulness and, in many cases, a commercial basis.
9. Maturity: Resilience and scalability are easier to provide with cheaper hardware and better understanding of how to achieve them.
10. History: Lessons from the dot com crash are not easily forgotten…
(Jana Technology Services, 2006)

At the Technet Summit in November 2006, Reed Hastings, founder and CEO of Netflix, stated a simple formula for defining the phases of the Web:
“Web 1.0 was dial-up, 50K average bandwidth, Web 2.0 is an average 1 megabit of bandwidth and Web 3.0 will be 10 megabits of bandwidth all the time, which will be the full video Web, and that will feel like Web 3.0.” (Hastings, 2006)

According to Wikipedia, "Web 2.0 is a trend in the use of World Wide Web technology and web design that aims to facilitate creativity, information sharing, and, most notably, collaboration among users. These concepts have led to the development and evolution of web-based communities and hosted services, such as social-networking sites, wikis, blogs, and folksonomies."

The short answer, for many people, is to make a reference to a group of technologies which have become deeply associated with the term: blogs, wikis, podcasts, RSS feeds etc., which facilitate a more socially connected Web where everyone is able to add to and edit the information space. The longer answer is rather more complicated and pulls in economics, technology and new ideas about the connected society. To some, though, it is simply a time to invest in technology again: a time of renewed exuberance after the dot-com bust.

However, Sir Tim Berners-Lee (the inventor of the web) has an opposing view on the concept of Web 2.0. When asked in an interview for a podcast, published on IBM’s website, whether Web 2.0 was different to what might be called Web 1.0 because the former is all about connecting people, he replied:
"Totally not. Web 1.0 was all about connecting people. It was an interactive space, and I think Web 2.0 is of course a piece of jargon, nobody even knows what it means. If Web 2.0 for you is blogs and wikis, then that is people to people. But that was what the Web was supposed to be all along. And in fact, you know, this 'Web 2.0', it means using the standards which have been produced by all these people working on Web 1.0."
(Anderson, 2007)

Saturday, April 12, 2008

Online Communities

ONLINE COMMUNITIES
What are they and how do we organise them?

Online communities consist of what are usually like-minded people who come together online to participate, debate and share information. These online communities (sometimes referred to as virtual communities) are not, however, just about communicating globally with people with whom you share the same interests; they are also about staying in touch with real-life friends and acquaintances. These virtual relationships are maintained effectively through the use of social software such as topical sites, chatrooms or even interactive virtual worlds (Dibben, 2008).

The website for an online community can serve several purposes:
- provision of information about the community and how to participate;
- hosting of the tools of communication and conferencing; and
- knowledge management for the community: providing ways to organise relevant information contributed by the community and the history of the community.
(Dibben, 2008)

The majority of the difficulties that lie with organising an online community are due to its "uncontrollable nature." As stated by one freelance technology consultant in an interview, "an organisation has to deal with a community that is more difficult to control than one built in real-life, since members can interact between each other quite easily, even organise themselves independently." (Cutrupi, 2006)

Ultimately, the way to successfully organise an online community and ensure its success and longevity is to take the following steps:
- build personal relationships among community members
- develop an active, passionate core group
- create forums for thinking together as well as systems for sharing information
- make it easy to contribute and access the community's knowledge and practices
- create real dialogue about cutting edge issues and information
- focus on topics important to the community
- find a well-respected community member to coordinate/facilitate the community
- make sure people have time and encourage participation
- get key thought leaders involved
(Dibben, 2008)

Online communities are a great way to allow people from varying backgrounds to contribute ideas on common interests and ideals.