Associate Professor Kate Crawford, Journalism and Media Research Centre, UNSW Professor Catharine Lumby, Journalism and Media Research Centre, UNSW
Author Details
Associate Professor Kate Crawford is the Deputy Director at the Journalism and Media Research Centre at the University of New South Wales. She is an internationally recognised researcher of internet technologies, and recently conducted Australia's largest study of mobile and social media use by 18-30 year olds, funded by the Australian Research Council. Catharine Lumby is Professor of Journalism and Director of the Journalism and Media Research Centre at the University of New South Wales. She is the author of seven books and numerous book chapters and journal articles, and is an international expert on media and gender studies. She has been awarded five Australian Research Council grants and was a member of the Advertising Standards Board.
Acknowledgements
James West and Hannah Withers provided invaluable research assistance in the preparation of this report, and we would like to thank them for their careful work. We also thank Peter Coroneos, Chief Executive of the Internet Industry Association; Peter Leonard, a partner at Gilbert + Tobin; David Simon, former member of the Classification Board; and Dr Peter Chen of the University of Sydney for contributing their expert advice in the preparation of this report. We also acknowledge Google Australia, which provided a contribution towards the research funding for this report.
Table of Contents

Author Details
Acknowledgements
Table of Contents
Executive Summary
Introduction
Section 1: The State of Play
1.1 The Rise of Convergent Media
1.2 More, Faster: Australia Over the Next Five Years
1.3 Regulating the Convergent Environment: The Current Picture
Regulatory Inconsistencies
Case Study 1: Facebook and the Queensland Government
Case Study 2: Games, Online and Offline
Conclusion
Section 2: The International Arena
2.2.2 The Internet Governance Forum (IGF)
2.2.3 The Organisation for Economic Co-operation and Development (OECD)
Conclusion
Section 3: Towards a New Policy Framework
4.1 Recommendations
Appendix One: The History of Australia's Regulatory Environment
Executive Summary
The purpose of this report is to consider the following questions:

How can Australia position itself to cope with the risks and opportunities presented by our current media environment?
What principles would guarantee a flexible and balanced 21st century system of media content regulation in an evolving media ecology?
What challenges lie ahead in adapting existing regulatory approaches to media silos - vertically defined media regulation for television, film, newspapers, radio - to a convergent media environment?
What happens if we consider how content moves horizontally - from radio and television to podcasts and YouTube, to mobiles and tablet devices?
What is the role of users in defining our media environment: from flagging problematic content, to shaping platforms, to contributing to media policy?
What can Australia learn from international approaches to convergent media content regulation?
What role should governments, industry and users play in media governance?
How can we facilitate dialogue between nation-state governments, industry regulation, user communities and international laws?

This report visits these issues in detail through a consideration of the history of online media governance, a comparison of international approaches, a series of case studies that highlight current challenges in managing online content, and a consideration of where the regulatory balance should fall between government, industry and user communities. In considering these issues, we acknowledge that the current media environment poses extraordinary new challenges for governments, industry stakeholders and media users. While we canvass a broad range of issues and approaches to managing these challenges, we recognise the complexity of adapting existing regulatory approaches and of ensuring that people are given the information and resources they need to navigate this new media landscape.
In preparing this report we welcome the announcement of the Federal Government's Convergence Review by the Minister for Broadband, Communications and the Digital Economy, Senator Stephen Conroy. We present our findings as a research-based contribution to this process and its remit: to take a fresh look at Australia's existing regulatory frameworks with a view to modernising them.1

The communications sector in Australia now reaches across an unprecedented array of sectors in the private and public spheres. Communications technologies are the backbone of our health, education, government, finance and culture sectors. The information revolution is critical to Australia's economic competitiveness and ongoing social and cultural development. Yet Australia's laws have not kept up with this technological evolution or with the changes in the diverse modes of media production and consumption.
1 http://www.minister.dbcde.gov.au/media/media_releases/2010/115
Australia is moving away from the legacies of a vertical media environment, in which different networks such as telephone networks and radio networks were regulated and operated for clearly distinct purposes, towards a convergent network environment. When we use the term convergence, we are referring to this collapse of borders between media silos, in which content can easily move horizontally across platforms. In this new horizontal environment, it is critical to pay attention to the different roles played by networks, platforms and content providers. Different regulatory and governance solutions need to be examined in each instance, rather than bundling them together under old media legislation. At the network layer, we argue, policy makers should focus on ensuring network openness, innovation and user choice. At the platform and content provider layers, government should work with industry and users, including in global fora, to encourage self-regulation while facilitating referral of genuinely disturbing material to national and international government regulatory instruments and agents. Community education about internet use, online security and legal obligations should be a priority in this area. There also needs to be an ongoing commitment to researching international approaches, emerging tools and community expectations.
Concerns about the production and distribution of harmful or criminal material in the convergent media environment have dominated much of the public debate about new technologies and platforms. This is understandable, given that protecting the community, and particularly children, from inappropriate material has been and continues to be a core principle of media regulation and content management. This focus on harmful material, however, has often come at the cost of a broader appreciation of the benefits of the convergent media environment. The influential UK Byron Review2 noted that public policy and regulation that is genuinely and empirically grounded in an ethic of care for children and young people will fail if it relies too heavily on a simplistic block-and-control strategy. In Australia, the focus of debate on internet filtering has come at the cost of thinking about the wider media environment, leading us to ignore broader issues of care and user participation.

In reconceptualising media content management and regulation, it is critical to recognise that our networked media era offers unparalleled opportunities for innovation, entrepreneurship and the growth of knowledge, and that it has the potential to extend these opportunities to all Australians. Traditionally, Australian media content regulation has worked in a top-down manner: governments regulate or require media content providers to cooperate with them in co-regulatory or self-regulatory schemes. In this framework, everyday media users had only a small capacity for direct input, through complaints mechanisms or through the judicial system. This framework was based on a model of media audiences as largely passive bodies of consumers, with little need to interact with content producers or regulatory authorities. In contrast, contemporary media users are not just consumers: they are highly active, and are often media producers and distributors.
Within social networking services that host extraordinary quantities of data it is users who are the most likely to identify offensive material and to notify the relevant host or government agency. In this report we suggest that it is critical to see government, industry and media users as key stakeholders who must work together in the future governance of media content. By cooperating, the three groups can increase opportunities for the identification of truly harmful material and the enforcement of criminal law. They can share the responsibility of governance of media content in an era where the sheer volume of material outstrips the capacity of any government or corporation to pre-vet all material.
2 Byron Review. 2008. Safer Children in a Digital World. The Report of the Byron Review: Department for Children, Schools and Families, and the Department for Culture, Media and Sport.
Convergent media governance must take the full spectrum of stakeholders into account, from the end-user to the parent, from the school to the wider community, to industry and government. A key plank of this cooperation is the need for government and industry to educate consumers and provide them with resources to work in online communities to identify problematic content and to notify relevant organisations or authorities. Media literacy is vital. Education about opportunities and risks online is a particularly critical component of any strategy that aims to protect children, as well as to maximise the potential for innovation and creative engagement.

At a global level, it is important that Australia, as a robust democracy, inclusive society and cultural innovator, takes a leading role in furthering international cooperation between industry, governments and media user communities. The term convergent media environment describes a globally networked media ecology in which no single national government or industry group can work alone to manage or regulate content. This report explores models for international cooperation, which we argue will be central to future governance schemes. In the convergent media environment, traditional media platforms, content and audiences now coexist with their new media counterparts. It is an environment marked by an unprecedented diversity of users and usage, spanning the spectrum from amateur videos made for friends and family to high-end professional content produced for international consumer markets, all circulating on the same platforms. This diversity does not render redundant the core values that have underpinned Australia's approach to managing and regulating media content.
On the contrary, we argue that it is time to think carefully about how those core values can inform a set of principles and approaches to content management in the 21st century: that we need a fresh and adaptive approach to ensure that we balance the opportunities and complexities posed by the convergent media era. We argue that these ten key principles should underpin a new system of media content regulation in Australia:
1. An adaptive approach. We are part of a rapidly evolving media landscape, where media are not silos but are closely intertwined. Our policy frameworks have not kept pace. It is essential that Australia conducts a first-principles review of media content policy and develops a flexible system.
2. Working together. The regulation and management of convergent media is best done by industry, government and end-users sharing approaches and concerns and acting collaboratively.
3. Recognising layers. In the convergent environment, there are distinct layers: the infrastructure level of the networks, then platforms and content. Governance solutions need to be attuned to the different issues at each level, while keeping the network layer open and interoperable.
4. Rethinking content. Content is not media-specific but moves fluidly through multiple spaces, being repurposed and recirculated. Frameworks for acceptable content are more effective, and have fewer chilling effects, than network-level filters.
5. The importance of users. Public policy and law must recognise the critical role that user communities now play as participants in and creators of online environments, as well as in notifying industry and government of offensive content.
6. Consistency. All states and territories should have a uniform approach to the sale, distribution and possession of prohibited or restricted content.
7. A new Classification Scheme. We strongly support the review of the Classification Scheme. It is time for a purpose-built system for a 21st century media environment, relying on comprehensive and empirical research into community standards in relation to media use.
8. Media literacy. A national plan needs to be developed, based on empirical research, to foster digital literacy. Industry, government and community members should participate in the formulation of the plan, which should include community education about safety and security on the internet.
9. Committing to the big picture. A broad set of national principles needs to be developed that supports technology neutrality, a commitment to the free flow of information, the protection of vulnerable users, and an innovative digital economy.
10. An international perspective. Australia needs to be an active participant, working with other governments and industry within international fora on media governance issues.
Introduction
The historical principles, relevant to the scope of this report, that have traditionally informed media content regulation in Australia include:

The need to promote self-regulation and competition in the communications sector while protecting consumers and other users.3
Ensuring that adults are able to read, hear and see what they want while minors are protected from material likely to harm or disturb them.4
The public interest in ensuring that news and current affairs media provide balanced, objective and accurate information.5
Regard for community standards in relation to material that condones or incites violence, particularly sexual violence, or portrays people in a demeaning manner.6
The need to ensure a balance between local and international media content and to ensure culturally diverse content.7
The protection of privacy.8
The need to balance the right to free speech with the responsibility of government to protect citizens from content deemed harmful.9
The protection of national security.10

The Productivity Commission's report on broadcasting contains a clear summary of the values that underpin media content regulation and management in Australia: Important social and cultural objectives of broadcasting policy include ensuring diversity of sources of information and opinion, adequate levels of Australian content and appropriate program standards. Freedom of expression is also important and should be added to the objectives of the Broadcasting Services Act 1992. Diversity of sources of information and opinion is most likely to be served by diversity in ownership of media companies, and by competition.11

The report also notes that controlling the potentially harmful consequences of media influence must be weighed against the benefits of independent and open media in a democratic society and that, as new media proliferate and media organisations converge with other businesses, regulatory restrictions on freedom of expression will have an increasingly important place in media law. Media content is currently regulated through a combination of direct regulation (laws, government regulatory bodies and licences), co-regulation (industry-based codes of practice with government approval and potential sanctions) and self-regulation (industry-endorsed codes of practice). This model privileges two actors:
3 About Communications and Media Regulation, The Australian Communications and Media Authority, www.acma.gov.au/WEB/STANDARD/pc=PUB_REG_ABOUT_BCAST
4 Classification (Publications, Films and Computer Games) Act 1995.
5 See industry codes of practice at The Australian Communications and Media Authority, http://www.acma.gov.au/WEB/STANDARD/pc=IND_REG_CODES_
6 The National Classification Code.
7 See The Future for Local Content? Options for emerging technologies, 2001, Australian Broadcasting Authority, available at http://www.acma.gov.au/webwr/_assets/
8 Productivity Commission, 2000, Report into Broadcasting, available at http://www.pc.gov.au/projects/inquiry/broadcasting/docs/finalreport, viewed 15/4/10.
9 Ibid.
10 See Productivity Commission, 2000, Report into Broadcasting, available at http://www.pc.gov.au/projects/inquiry/broadcasting/docs/finalreport, viewed 15/4/10, pp. 332-333; Robert Albon and Franco Papandrea, Media Regulation in Australia and the Public Interest, November 1998, Current Issues.
11 Ibid.
government and industry. It reflects an era in which media consumption was dominated by the production of messages by the few for the many. While media users have historically had some role in media content regulation in Australia, through the capacity to notify regulatory agencies or media organisations of their concerns about content, their role in regulation has been limited. In a media environment characterised by the rapid growth of online and mobile media, in which media users are often media producers and the distinction between these activities is increasingly blurred, the potential role of media users in regulation assumes greater importance than it has historically been accorded.

It is an era in which the one-to-many model of media content production and distribution has been fundamentally and permanently altered. In the space of a decade, our media environment has transformed into one where average Australians actively produce and distribute their own media content through blogs, social networking sites, and videos uploaded to platforms such as Facebook and YouTube. The technological means to produce sophisticated media content have been domesticated, and the distribution channels are also growing. We live in a rapidly evolving media environment, one in which technologies and business models are volatile. It is apparent that industry and government stakeholders are still coming to economic, cultural and political terms with this global media environment. Users are increasingly challenging existing models of consumption and production, and of regulation. This rapid evolution requires us to think carefully and critically about the focus and scope of media content governance. An important example here is the issue of ensuring a balance between local and international media content. Historically, local media content was promoted and protected by systems that sought to find a balance between industry development and commercial considerations.
As the Productivity Commission noted in a 2000 report on broadcasting policy, barriers to entry are balanced against programming obligations.12 In the convergent media environment, both the sources of media content and the means of distribution and consumption are multiplying exponentially. For example, while there remains an important role for governments to play in promoting and assisting the development of local media content and content creation, it is clear that the conventional framework for addressing concerns about maintaining the profile and diversity of local content does not automatically apply to online spaces. This report aims to put the convergent environment into context and to generate broad principles to guide the necessarily more detailed debates about the applicability of the historical principles which have underpinned media content policy in the past. At present, Australian approaches to internet content regulation are still based largely on traditional media regulatory models and assumptions. The internet is not a new form of media: it is a new media environment, one in which media users have unprecedented agency in consumption and production. We note that, unlike in other nations and transnational groupings, notably the UK and the EU, the risks and opportunities of online media consumption and production have not, to date, been the subject of broad, rigorous empirical research to guide government in its approach to media content governance and management.
12 Productivity Commission (2000), Broadcasting, Report no. 11, Canberra: Ausinfo, p. 254.
13 ACMA (2009a), Communications Report 2008-09, pp 15-16.
14 See NBNCO: http://www.nbnco.com.au
15 Convergence Review Announcement, December 14, 2010. http://www.dbcde.gov.au/digital_economy/convergence_review
16 ACMA (2010), Communications Report 2009-2010, report 2, p.13
17 Nielsen (2011), Media Release: Nielsen's State of the Online Market: Evolution or Revolution?, http://au.nielsen.com/site/documents/
18 The Nielsen Company (2011), Nielsen's State of the Online Market: Evolution and Revolution?, http://au.nielsen.com/site/documents/
19 The Nielsen Company (2010b), Australia Getting More Social Online as Facebook Leads and Twitter Grows, March 23, available http://blog.nielsen.com/nielsenwire/
20 The Nielsen Company (2010c), Nine million Australians now interacting via social media sites, Media Release, 15 March 2010, available http://www.nielsen-online.
phone capable of accessing the internet.21 Relaxed cap plans offered by service providers have seen mobile social networking soar in recent years. A quarter of self-described social networkers now do so on their phones, as well as on their home and work computers.22 Many online spaces have applications (apps) specifically designed for mobile phones. Facebook's iPhone app, for example, uses the iPhone's camera and gallery, enabling users to snap a picture and upload it to their wall for friends to instantly see and comment on, on the run. A similar app exists for YouTube. Mobile phones, in concert with social media services, have acted as windows onto a range of key international crisis events, including protests, natural disasters and terrorist attacks. This is an example of what researchers Axel Bruns and Mark Bahnisch call the hyperlocal benefits of social media trumping professional content makers:

[...] local participation may be harnessed to report on local events or record local histories, or to capture local insider knowledge which is available only to long-standing members of the (offline) local community. Here, particularly, there is also an important role for the use of mobile devices to capture such information on the spot and virtually in real time - such uses range from the use of Flickr or Twitter to report high-profile events such as the 2005 London bombs or the 2008 Mumbai attack through to comparatively more mundane activities such as sharing information about traffic jams, potholes, restaurants, or travel destinations.23

The functionality of mobile devices will continue to collapse categories of communication. An always-on culture of mobile internet use in Japan, discussed in Section 2 of this report, has been described as creating an ambient virtual co-presence.24 Ubiquity, always-on connectivity, context sensitivity and the central role of the user's identity make mobile internet devices powerful media tools.
Ralph Schroeder argues that there will continue to be denser, more extensive, more time-consuming and more non-location-specific ties.25 Australian researchers have analysed this always-on culture as a driver of online media cultures, finding that with the increase of mobile devices come opportunities for different gradations of social connectedness.26 Mobile phones have also created a geo-mobile web, where a massive trove of location information is tagged to RSS feeds, web pages and comments on social media sites. This is transforming the relationship between data space and physical space, creating new ways for users to create a sense of place or belonging.27
21 The Nielsen Company (2011), Nielsen's State of the Online Market: Evolution and Revolution?, http://au.nielsen.com/site/documents/
22 The Nielsen Company (2010b), Australia Getting More Social Online as Facebook Leads and Twitter Grows, March 23, available http://blog.nielsen.com/nielsenwire/
23 Bruns, A. & Bahnisch, M. (2009), Social Media: Tools for User-Generated Content: Social Drivers behind Growing Consumer Participation in User-Led Content Generation, Smart Services CRC, Volume 1 State of the Art March 2009.
24 Ito, M. and Okabe, D. (2005), Technosocial situations: emergent structuring of mobile e-mail use. In Ito, M., Okabe, D. and Matsuda, M. (eds), Personal, Portable, Pedestrian: Mobile Phones in Everyday Life. Cambridge, MA: MIT Press, 447-51.
25 Schroeder, R. (2010), Mobile phones and the inexorable advance of multimodal connectedness, New Media Society, 12: 75.
26 Michael Bittman, Judith E. Brown, Judy Wajcman (2009), The Cell Phone, Constant Connection and Time Scarcity in Australia, Social Indicators Research, Volume 93, Number 1, August 2009.
27 Crawford, A. & Goggin, G. (2009), Geomobile web: locative technologies and mobile media [Paper in special issue: Placing Mobile Communications. Lloyd, Clare; Rickard, Scott and Goggin, Gerard (eds).] Australian Journal of Communication, v.36, no.1, 2009: 97-109.
Traditional media companies are also changing how they offer content to their audiences. Newspapers such as The Australian and the Sydney Morning Herald are offering mobile and iPad-based content. Television networks are developing stronger links between web and mobile content and traditional programming, offering mobisodes and alternate reality games (such as that used by the TV series Lost between seasons). New intersections between broadcast and participatory media are emerging. For example, the ABC has established a service called Pool, a social media space where ABC employees and audiences can collaborate and co-create content.

In the background, Australians are changing the type and combination of devices they use as a way of accommodating change. Australians are opting for multiple services, including traditional fixed phones, 3G, broadband and wireless (most adults use three forms of communications technology regularly).28 Some Australians are substituting old phone lines with mobile handsets, with subscribers to 3G networks jumping 44 per cent between 2008 and 2009. Young consumers are leading this charge, according to ACMA.29 Mobile uptake among Australians aged between 24 and 35 is at 95 per cent, the highest in the country.30 Fixed telephone line use dropped over the same period, to just one in ten people over 14 subscribing. Moreover, as prices fall, companies are simultaneously offering more services.31 There is a growing, if tentative, use of VoIP, especially Skype. Two and a half million Australians accessed a VoIP service in the middle of 2009.32 Once the National Broadband Network is in place, this trend will markedly increase.33 This is the commencement of a wide-scale investment in a post-web internet environment, and the development of new technologies and social configurations.
household_consumers.pdf 3/24/10.
Ibid.
Ibid.
Ibid.
Ibid.
Scott Rickard. 2009. Are you there? Encouraging users to move from peer-to-peer to Voice over Broadband. Telecommunications Journal of Australia. 59 (3): pp. 42.1 to 42.7.
It is worth noting that while funding models underpinning the National Broadband Network have been the focus of disagreement between the major parties in Australia, there is bipartisan support for the need to ensure better broadband access across Australia.
IPTV and internet video delivery models: video content services over IP in Australia, ACMA, June 2010.
ACCAN (2009), Future Consumer, p 5.
More-powerful mobile devices, ever-cheaper netbooks, virtualization and cloud computing, reputation systems for social networking and group collaboration, sensors and other small systems reporting limited amounts of information, do-it-yourself embedded systems, robots, sophisticated algorithms for slurping up data and performing statistical analysis, visualization tools to report results of analysis, affective technologies, personalized and location-aware services, excellent facial and voice recognition, electronic paper, anomaly-based security monitoring, self-healing systems...37

The manner in which people communicate with each other on the internet, and the way that they produce content, now and in the future, requires a rethinking of the way traditional media policies work. Institutional forms that have sufficed for regulating some of these functions in other media no longer work, writes Australian media studies scholar Sal Humphreys. It requires a breaking down and revisioning of policy areas and strategies. It requires a new form of literacy in users, and the development of new skills and strategies.38
Regulatory Inconsistencies
The internet is a multifaceted, distributed network with no centralised gatekeeper. The vast range of communication options it contains was once governed by distinct policy areas. This raises two significant problems in relation to future regulation. The first is that existing media become digitised and are distributed differently, which brings the existing rules pertaining to each medium into question. The scarcity of broadcast spectrum, for example, resulted in strict transmission rules that are now challenged by the relative abundance of digital transmission. The second problem is that separate media silos are now interchangeable on different digital networks. Television content can be seen on TV, delivered online, and sent to a mobile. The service is the same, but the rules for each medium are different. Speaking from the US experience, communications scholars François Bar and Christian Sandvig write that policy responses to convergence end up being ramshackle and jerry-built, a story of evolutionary inertia and incrementalism. Changes in media policy have rarely appealed to an underlying driving truth - the why of regulation - rather, they are knee-jerk responses to existing regimes:

As a result, policy-makers looking to resolve convergence challenges have favoured incremental adaptation of past rules rather than fundamental redesign of the policy regime. They have chosen either to treat a new medium with the policy previously applied to whatever it seemed to resemble, or to adjust through the accretion of exceptions and additions. Thus, policy treats cable television as an extension of broadcast, itself viewed as an extension of radio.40

A similar situation exists in Australia, where, state by state and platform by platform, content is regulated differently. Australia has yet to fashion laws that fully amalgamate broadcasting and telecommunications with an understanding of the ways the internet is changing the end user's experience.
Communications is poised to become truly cross-sectoral, reaching across health, government, commerce and, of course, the media.41 Australia's laws have not kept up with the technological evolution and are ill-prepared to absorb such rapid change. In the area of consumer protection, there has been little opportunity to understand the complexities of user experience, despite wider community concern about protecting children from inappropriate content. This is especially the case for social networking, user-generated content, gaming and online immersive worlds. As the following case studies will show, there is now a distinct need for industry to work with government and users to build media literacy and to expand cooperation with one another.

Social networking platforms exist in a grey area of regulation: they are constantly changing the types of content they offer, combining ephemeral and stored content, the great majority of which is user-generated. Services like Facebook and Twitter argue that they cannot be held legally responsible for any of the content because they are not the publishers - they just provide the platform. Laws designed to regulate traditional publishers are often ill-suited to online services because they are based on traditional media production and use practices, according to which a proprietor can be assumed to take responsibility for published content.
The lifespan of online content is also relevant here. On the one hand, online conversations through social networking sites, bulletin boards, email lists and instant messaging services are transitory; on the other, they have the capacity to exist well beyond their intended initial time-frame and purpose, and can be mined for information and bought and sold commercially, even after the death of the original author.42 Current regulations are not able to respond well to situations that collapse the public and private spheres, as well as stored and
40. Bar, F., & Sandvig, C. (2008), US communication policy after convergence, Media, Culture & Society, 30(4), p. 531.
41. Goggin, G., & Milne, C. (2009), Great expectations? Regulating for users in the United Kingdom and Australia, Telecommunications Journal of Australia, 59(3), pp. 47:12.
42. ACCAN (2009), Future Consumer, p. 63.
transient content.

Further regulatory inconsistencies exist beyond social networking services. Online games provide one such example. Although regulated in the same manner as any other game (by the Classification Board), online games are also subject to regulation by ACMA, since they are considered to be online media. It is possible to access X18+ videos in the ACT, but not online from Australian-hosted sites. Content that would be rated R in a film is refused classification if it appears in a game, though at the time of writing this was under government review. Gamers can simply circumvent the system by buying R18+ games online from offshore stores or downloading them illegally. The possibility of inconsistent regulation between platforms is a serious concern for policy makers and legislators. Humphries concludes that, however daunting, a new approach is needed, one that can cater to the many ways in which people use the internet, to the various platforms for delivering content, and to the manner and locations in which content is consumed, whether publicly or privately:

With more conventional media well-worn pathways of distribution made it easy to control dissemination of restricted content, and the limited amount of content published made it possible to implement reviewing processes. Because these processes and conventions have been disrupted by the new structures and practices of the internet and its users, new conventions need to be established in order to achieve a balanced set of protocols that take account of freedom of expression as well as community standards.43

Increasingly, social networking and user-generated content are producing challenges for the current regime. Two case studies help to illustrate these challenges: Facebook's self-regulation, and the current lack of an R18+ rating for games in Australia. Each example provides insight into the contradictions and gaps in current regulation.
In response, Facebook wrote a letter to Premier Bligh, which included an explanation of the privacy controls and alert settings that can be used to inform Facebook about the existence of inappropriate content.44 They also expressed their regret: "These vandals have cast a shadow on an already tragic experience, and we are disappointed and disgusted that anyone would turn a tribute Page into anything but a place of respect and honour," wrote communications executives Debbie Frost and Elliot Schrage. Facebook argued that such content violates its terms of use. Facebook also took the unprecedented action of reaching out directly to Australian users to emphasise the ways in which Page and Group administrators (often those who set up the sites) can
43. Humphries, S. (2009), op. cit.
44. Facebook letter to Queensland Premier Anna Bligh, 25 February 2010.
remove content themselves. Facebook said that it continues to develop industry tools that enable it to respond more quickly to what it calls the "400-million strong police force that our Facebook users represent". The executives concluded that the complete prevention of inappropriate content or "eradication of tasteless material is not something we or any society can deliver".

Facebook users are required to adhere to a Statement of Rights and Responsibilities,45 including points 3.6 and 3.7: "You will not bully, intimidate, or harass any user" and "You will not post content that is hateful, threatening, pornographic, or that contains nudity or graphic or gratuitous violence". Facebook makes it clear that it can remove anything posted that it believes violates this statement, or terminate the service. Like many social networking sites, Facebook resists being defined as a publisher. As Facebook spokesperson Debbie Frost explained to The Australian in March 2010:

We didn't build a site to be a publisher, we built it to be a platform. We built it to give people tools to share information with each other and I think the enormous success we've seen is testament to the fact that human people do want to do that and the vast, vast majority (hundreds of millions of people) are not behaving the way these few people did in Australia, so it seems to be going OK as a system.

Frost admitted to some procedural slowness; however, she argued that the problem is worse in the real world:

If I put up nude graffiti on the side of a church, how do you report that, how do you get it taken down in a way that's good enough for you? It takes time in real life and on the web and we think our system is actually more responsive ... If I phone you up and say really offensive things, does that mean the mobile phone operator is liable for that?
The report also stated that Facebook keeps all of its records and would cooperate completely with any police investigation.46 This being the case, it could be argued that Facebook in fact presents a stricter regulatory environment than the one that exists offline. While questions can be asked about Facebook's responsiveness to particular acts of vandalism, it nonetheless puts considerable effort into removing obscene and vulgar content that may actually be legal to host, out of deference to its community and to the standards amenable to advertisers. To the extent that illegal acts may be undertaken by users on Facebook, these can still be investigated by the relevant authorities. For example, Queensland police have conducted investigations into the vandalism case cited above, with a particular focus on claims that some of the obscene materials posted included images of child abuse. Additionally, the pages set up to vilify the alleged murderer of Trinity Bates may constitute contempt of court, as this case is currently sub judice.47 Such intentional misuses of the platform present significant challenges to media regulators. Mandatory filtering and complaints-based approaches to media regulation would face precisely the same issues of responsiveness encountered by Facebook itself. Given that Facebook is an ever-changing, user-driven environment, automated filtering systems are unable to reliably differentiate between pages displaying legitimate and illegal content. And a system built on complaints to a regulator would require impossibly large resources to evaluate submissions and assess content posted by users, even if the security and privacy issues related to accessing users' personal content could be successfully overcome. Only an approach based on collaboration between the platform provider and its users, including rapid responses to user feedback, can adequately address questions of
the suitability of content on a site such as Facebook. Community interests need to be heard, and this is a responsibility that privately owned services need to take very seriously.
McCrea, Christian (2011), Fair games: Why we should back an R18+ Classification, The Conversation, 19 April, http://theconversation.edu.au/articles/fair-game-why-we-should-back-an-r18-classification-1025
From the campaign site Australia needs an 18+ rating for video games: http://www.r18games.com.au/gta/
Parker, Laura (2011), South Australia to introduce R18+ for games, GameSpot AU, 28 April, http://au.gamespot.com/pages/news/story.php?sid=6310534&skipmc=1, viewed 28 April 2011.
Online games such as World of Warcraft reveal an even more complex side of gaming classification. These networked games are doubly regulated under the Australian system, both as online content and as traditional games; however, neither system of regulation demonstrates a deeper understanding of the textual or community meanings of the games. Online games enable, and require, user-generated content and social networking. Defying traditional ideas of a finished product, online games continue beyond any one user's experience, with millions of authors constantly interacting and competing. Such networked production, argues Sal Humphries, highlights the need to approach such media not merely as texts, but as dynamic sets of relations and processes:

Moves to force this new genre of participatory media into the strictures of old conventions seem unwise, yet the power and influence wielded by established media interests mean policy and regulation continue for the most part to act to preserve the old rather than facilitate the new. The interests of users, now participators and producers, need to be thought about alongside those of corporate publishers, not only in terms of their access to cultural and social capital, but in terms of what their rights, risks and obligations might reasonably be in such a system.52

We agree that the current regulation of games in Australia is problematic and at odds with that of comparable countries. One pressing issue is the introduction of a national R18+ rating, to prevent games from being either miscategorised, and thus made available to younger audiences than intended, or inappropriately banned from adult use.
Conclusion
In this section we have mapped the ways in which Australia's content regulation policies are struggling to keep up with current patterns of media production and consumption. Social networking and user-generated content operate with a different production model in mind: they are both dynamic and networked. The traditional policy silos that governed individual media forms are no longer adequate. The extant classificatory regime is subject to significant blind spots when it comes to platforms such as the mobile internet, games, social networking and user-generated content. More sophisticated, flexible and broad-based 21st-century principles of media regulation are required in order to assess the impact of new technologies, and to consider how Australia might sensibly adapt its laws to better reflect the realities of a convergent media environment.
52. Humphries, S. (2009), Discursive constructions of MMOGs and some implications for policy and regulation, Media International Australia: Incorporating Culture and Policy (130), pp. 63-64.
53. Wilske, S., & Schiller, T. (1997-1998), International Jurisdiction in Cyberspace: Which States May Regulate the Internet, 50 Fed. Comm. L.J. 117.
The table below contextualises Australia's attempts to regulate the ICT market with international examples, and draws out concepts that can help build evidence for a best-practice approach to regulation.

Australia
National ICT strategy: No.
Media literacy: No overarching strategy; media literacy falls under the remit of ACMA; Digital Education Revolution (2009).
Content filter: Not yet. Mandatory/Government (proposed). Current: voluntary consumer Family Friendly Filters administered by the IIA and ACMA.

Japan
National ICT strategy: Yes. u-Japan Policy (Ubiquitous Internet); under this comes the Act on Development of an Environment That Provides Safe and Secure Internet Use for Youth, and the Japan Safer Internet Program.
Media literacy: Yes. Japan Safer Internet Program legislates minimum standards for private industry in media literacy and education.
Content filter: Yes. Voluntary/Industry. Japan Safer Internet Program encourages private industry to develop content filters; operators tailor filters and parental controls themselves.

New Zealand
National ICT strategy: Yes. Digital Strategy 2.0 (2008).
Media literacy: No.
Content filter: Yes. Voluntary/Government. Digital Child Exploitation Filtering System.

UK
Media literacy: No. Complementary self-regulatory approaches include Click Clever Click Safe: The first UK Child Internet Safety Strategy (UKCCIP) and ThinkUKnow (CEOP).
Content filter: Yes. Voluntary/Industry. The IWF maintains a list given to ISPs.
Tano, H. (Consumer Services Department, NTT DOCOMO)(2009), DOCOMO's Filtering Service Initiatives, presentation to ITU/MIC Strategic Dialogue for Safer Internet Environment for Children, Tokyo, Japan, 2-3 June 2009, available http://www.itu.int/osg/csd/cybersecurity/gca/cop/meetings/june-tokyo/documents/
Shiotani, S. (2008), Challenges Facing the Cable Television (CATV) Industry in an Effort to Create Survival Business Models, Keio Communication Review, No. 30, 2008, p. 51.
telecommunications companies: triple-play (or bundled) services combining telephone, TV over IP, and internet access.62 Despite these changes, Japan, like Australia, is facing the challenges of rapid convergence. There is no current regulation (apart from copyright law) for digital content designed for IP-TV, video on demand, or catch-up TV programs on the internet. There is also no policy for what the Japanese government calls "terminal convergence" (mobile TV phones, or TV sets that can receive IP-TV). There is no competition regulation covering cross-ownership of broadcast and telecommunications by the same entity.63 Regardless, Japan continues to produce a remarkable number of innovations in what has been dubbed "Content Fusion Technologies": live TV-based chat during programs, integrated search engines for both recorded and web content, automatic conversion of internet news into video via wEE (Web2TV with emotional expression), and even a system that automatically converts web content into cartoons for kids (called Interaction e-Hon).64

In August 2006, the Ministry of Internal Affairs and Communications set up a Study Group to take an in-depth look into the policy gaps that existed between telecommunications and broadcasting law. The Study Group released its final report in 2008.65 It concluded that there is a need to undertake a fundamental revision of the legal system for communications and broadcasting. In response to technological convergence, the Study Group was guided by these principles:

Free flow of information
Universal service
Maintenance of safety and reliability of information and communications networks
Technical neutrality

We argue that these are important principles to consider when devising Australia's way forward. The report also recommends a shift to horizontal management of the ICT sector, which would enable free combinations of contents and networks. Instead of being applied along the lines of the distribution model (e.g.
phone lines, cable, internet), a new system would apply according to three layers:

1. Content: regulated in categories, for example Media Service, or Open Media Content like blogs and personal web sites. The regulations would cover user-generated content.
2. Transmission infrastructure: consolidating regulations on communications and broadcasting transmission services; maintaining flexible use of spectrum across all platforms.
3. Platform: social media sites, e-government sites, etc.
62. Seki, K. (Director, International Economic Affairs, Ministry of Internal Affairs and Communications, MIC)(2010), Network Paradigm Shift: Deployment of Ultra-high Speed Access, available at http://www.bundesnetzagentur.de/media/archive/5471.pdf, viewed 17/03/2010.
63. Ibid, p. 33.
64. For a more detailed discussion of emerging technologies, see Miyamori, H. et al. (2007), Fusion of Communication Content and Broadcast Content, Journal of the National Institute of Information and Communications Technology, Vol. 54, No. 3, pp. 66-69.
65. Ministry of Internal Affairs and Communications (Japan)(2008), Final Report from the Study Group on a Comprehensive Legal System for Communications and Broadcasting, Biweekly Newsletter, Vol. 18, No. 21, February 8, 2008.
Regulating Content

Market deregulation in Japan has occurred alongside moves to regulate harmful content. In 2002, the Provider Responsibility Law encouraged online providers to remove potentially harmful content and, in response, ISPs drafted guidelines for self-regulation. In 2008, the government released its Final Report on a Comprehensive Legal System for Communications and Broadcasting,66 which acknowledged the need for a complete restructuring of the communications and broadcasting legislative regime, from a vertical hierarchy to a horizontal one. Although the report stated that "for the present moment, the need to regulate the platform layer as a separate individual layer is not great" because it is a newly evolving service, it noted that the government would need to keep watch on the evolution of the platform layer in order to ensure that it does not become a "bottleneck". The report argued that there was a need for softer, less prescriptive regulatory measures, while opposing the view put by some parties that regulation of the platform layer should be left entirely to the Anti-Monopoly Act and general competition laws. In 2011, early signs indicate that the Japanese government is increasing its interest in regulating the platform layer, citing concerns that platforms are becoming too powerful. Overall, the restructuring of the regulatory regime is still being developed and debated.

On the issue of child safety, Japan has the Act on Development of an Environment that Provides Safe and Secure Internet Use for Youth,67 which came into effect in April 2009. The Act calls on all stakeholders to ensure safety online, starting with children themselves, and working up through content providers, retailers, manufacturers, ISPs, teachers, parents, community groups and the government. Japan also formulated the Japan Safer Internet Program in 2009. The government calls this a comprehensive policy package regulating illegal and harmful online content.
Industry is obliged to provide parental controls and mobile internet filtering, tailored by companies to the age of the child, and is also obliged to take measures aimed at educating the public about media literacy.

Aims of the Japan Safer Internet Program68
1. Development of a Basic Framework to Provide Safety and Security
- Improving the basic legal system for a safer internet (e.g. encouraging filtering software)
- Promoting international cooperation
- Promoting actions with local public authorities
- Supporting public-private partnerships
2. Promotion of Voluntary Efforts in the Private Sector
- Tackling illegal and harmful information
- Exploring effective access prevention measures against child abuse material
- Encouraging the dissemination of content rating
- Supporting technical development
66. Japan International Affairs Department, Telecommunications Bureau (2008), Final Report on a Comprehensive Legal System for Communications and Broadcasting, available at www.soumu.go.jp/main_sosiki/joho_tsusin/eng/...21/Vol18_21.pdf, accessed 20/03/11.
67. Government of Japan (Cabinet Office)(2009), Act on Development of an Environment That Provides Safe and Secure Internet Use for Young People (English translation), available http://www8.cao.go.jp/youth/youth-harm/pdf/neteng.pdf, viewed 15/3/10.
68. APEC (2009), Japan Policy and Regulatory Update, 39th Telecommunications and Information Working Group Meeting Plenary Session, Singapore, 16-18 April 2009, available http://ec.europa.eu/external_relations/japan/docs/2008_japan_rrd_proposals_en.pdf, viewed 3/16/10.
3. Supporting Parent-Child ICT Media Literacy
- Information on moral education in the family, community, and schools
- Promoting user educational activities from third-party organisations
- Investigating the impacts of harmful information on children
Content Regulation = Self-Regulation

In Japan, obscene online content is defined as including:69
- Information about undertaking or mediating, or inducing, a crime, or that induces a suicide;
- Obscene depictions of sexual conduct or genitals, or other information that considerably stimulates sexual desire;
- Grisly depictions of murder, execution, torture or other extremely cruel content.

It is important to note that the acts of rating content and instituting criteria for filtering content have been left to the private sector. The policy does set minimum standards for parental controls for private industry,70 and companies can then tailor products directly to consumers. As regards illegal content, users are encouraged to report illegal and offensive content to the Internet Hotline Center, which is a member of INHOPE (the International Association of Internet Hotlines).71 Commencing in April 2011, a newly created organisation called the Internet Content Safety Association (http://www.netsafety.or.jp) will develop a blacklist of all child pornography sites located by the National Police Agency, which will then be referred on to search engines, ISPs and filtering companies, who are expected to voluntarily block the content from their services.

Mobile filtering works on an opt-out basis; internet filtering is opt-in. Mobile carriers are required to provide users under 18 with filtering services unless otherwise requested by parents. Parents who sign up for a mobile phone service are required to declare the age of the user when they enter into a contract. The opt-out system reflects the government's view that it is more difficult to monitor the use of mobile phones at all times, making children arguably more vulnerable to harmful content.

Poster and Slogan Competition: Case Study

Child protection strategies run alongside media literacy programs, through various government-supported campaigns.
Information Security Day falls on 2 February every year, kicking off a month of events. The Check PC! campaign included a website informing users how to enhance PC security, and was widely publicised on public transport and in the media.
69. This summary is provided by Aizu, I., & Bayer, J. (2009), op. cit.
70. Ouchi, K. (Deputy Director, Ministry of Internal Affairs and Communications, Japan) & Isozumi, K. (Deputy Director, Ministry of Economy, Trade and Industry)(2009), Workshop on Initiatives in Promoting Safer Internet Environment for Children, APEC-OECD Joint Symposium on Initiatives among Member Economies Promoting Safer Internet Environment for Children, available http://www.oecd.org/document/17/0,3343,en_2649_34223_43301457_1_1_1_1,00.html, viewed 3/15/10.
71. See http://www.inhope.org/gns/news-and-events/news/10-11-22/Internet_Hotline_Center_Japan_news.aspx, accessed 23/03/11.
One example of a digital literacy program run in Japan for children is that of the Information-technology Promotion Agency (IPA), which operates under the Ministry of Economy, Trade and Industry. For children and students, the IPA runs the annual Information Security Slogan and Poster Awards, which began in 2006 as an adaptation of the Korea Internet & Security Agency's program. Industry support for the program comes from entities including Microsoft and Symantec. Its goals are to help students identify online threats and learn how to avoid them. The posting of personal information, defamation and cyber bullying are recognised as the major online threats for children and students.72
Figure 3. "You don't know that you are well-known. Be careful for personal data leaks." Winning poster entry 2008, by Wakana Kato, Grade 3, Toki Commercial High School, Gifu, Japan.

Response From Industry: NTT DoCoMo

One objective of the Japan Safer Internet Program is to promote media literacy through third-party organisations. Evidence points to the fact that industry has met and exceeded the legislated minimum requirements to provide filters and educational initiatives.73 One example is NTT DoCoMo's Mobile Phone Classroom. Since it began in 2004, nearly one and a half million people have attended its 9,200 workshops.74 The company has also converted the sessions to DVD to send to homes. Topics include safe mobile use, parental controls and online abuse. The DoCoMo Family Safety Hotline responds to questions and concerns regarding mobile phone use by children, including questions about potential trouble, phone etiquette and appropriate billing plans. The hotline received some 60,000 inquiries in 2008.75
72. Yamada, A. (2009), Information-technology Promotion Agency, Japan: Initiatives on Awareness Raising of Students / Children, at APEC-OECD Joint Symposium on Initiatives among Member Economies Promoting Safer Internet Environment for Children, April 15, 2009, available at http://www.oecd.org/
73. ACMA (2009), Online Risk and Safety in the Digital Economy, op. cit., p. 51.
74. NTT DOCOMO (2010), Addressing the impact on children, http://www.nttdocomo.co.jp/english/corporate/csr/report/safe_secure/social/kids/index.html, accessed 3/15/10.
76. Cunliffe, D. (2004), Digital Strategy proposes new directions for ICT, Media Statement, 11 June 2004.
77. Newman, E. (2007), Address to Telecommunications Summit, Auckland, 25 June 2007, p. 5.
78. Keown, J. (2007), Digital strategy gets rethink, The Independent Financial Review, 27 June 2007, p. 6.
79. Cunliffe, D. (2007a), Digital Summit 2.0 Keynote speech at the Digital Summit 2.0, 28 November 2007.
80. Cunliffe, D. (2007b), Address to 8th Annual Telecommunications and ICT Summit by the New Zealand Minister for Communications, Fast-forward to the Broadband future, Hyatt Regency, Auckland, 25 June 2007.
81. New Zealand Ministry of Economic Development (2008), The Digital Strategy 2.0, p. 8.
New Zealand Government (2008), New digital super group announced, Media Statement, 25 March 2008.
The Minister for Communications and Information Technology has decided that Digital Strategy 2.0 will remain in place as the outcomes remain relevant in the current climate. The actions that support the outcomes in the strategy are under review and some will change to reflect the new government's priorities. Announcements regarding changes to the actions will be made by the responsible Ministers, or you can contact the relevant agencies directly for updates.82

As the above notice states, the Digital Strategy has not been retracted or replaced. It remains in place as a statement of desired outcomes. Yet many of the projects it highlights as a means of achieving those outcomes have been discontinued. For example, the Digital Development Council83 and the broadcasting regulatory review84 were both discontinued in 2009. While the future of the Digital Strategy remains unclear, it remains the best insight into New Zealand's ICT goals.

Confidence: An Integrated Response

The New Zealand government acknowledges that more online access increases children's vulnerability to online dangers, illegal content and bullying. New Zealand's strategy looks at the risk of harmful content to children alongside other online threats to adults, including malware, viruses, identity fraud and security attacks. The strategy aims for a universal understanding of online safety, security and privacy issues. It extended funding to two national programs: NetSafe, through the Ministry of Education, and the Digital Child Exploitation Filtering System, through the Department of Internal Affairs. Consistent with the rollback of Digital Strategy initiatives, the additional funding to NetSafe was withdrawn after the 2008/2009 financial year.85 However, the Ministry of Education remains a strategic partner of NetSafe, and continues to provide some funding.
Case Study: Hector's World

NetSafe is an independent not-for-profit organisation that promotes confident, safe and responsible use of cyberspace, with members drawn from across government, education, law, industry and the community.86
82. http://www.med.govt.nz/templates/StandardSummary____43904.aspx
83. Saarinen, J. (2009), Govt kills Broadband Investment Fund and Digital Development Council, Computerworld, 5 February 2009.
84. Joyce, S., & Coleman, J. (2009), Government concludes broadcasting regulatory review, Media Statement, 7 April 2009.
85. Ministry of Education (2010), Annual Report 2010, p. 109.
86. http://www.netsafe.org.nz
One of NetSafe's most successful programs is the award-winning Hector's World,87 a series of online products that help children to become confident users of the internet. Hector's World is produced by Hector's World Limited (HWL), a charitable subsidiary of NetSafe. Hector, a dolphin, and his sea-creature pals learn net basics in the ocean. The UK's Child Exploitation and Online Protection Centre has also launched the program across primary schools in Britain.88 In 2009, ACMA purchased a licence to the program through its Cybersmart website (www.cybersmart.gov.au).
Using Flash animation, the user is guided through a series of episodes. Each episode is accompanied by detailed lesson plans, homework suggestions, print-outs and colouring-in exercises. The aim is for students to learn to give their personal details only to people they can trust. The episodes emphasise listening to and acting upon any uneasy feelings, and the importance of trusted adults. Some of the messages for students include:
- stop and think for yourself before acting;
- bad websites can look like legitimate websites, and they can deliberately make terms and conditions difficult to understand;
- if something looks too good to be true, it probably is;
- not every person you meet online is trustworthy;
- young people can help by keeping an eye out for others online; and
- listen to your uneasy feelings.89

Hector's World is an example of a program that has shifted away from simple net safety ideas to a digital citizenship model (see Figure 6). This shift away from programs that only address online safety reflects the perceived benefits of providing adults and children alike with the skills to participate in the digital economy and to take responsibility for their own behaviour.90
87. http://www.hectorsworld.com; a good discussion of Hector's World can be read in ACMA (2009), Developments in internet filtering technologies and other measures for promoting online safety: Second annual report to the Minister for Broadband, Communications and the Digital Economy, p. 4, available http://www.acma.gov.au/webwr/_assets/main/lib310554/developments_in_internet_filters_2ndreport.pdf, viewed 3/16/10.
88. Hector's World Ltd (2008), NZ's Hector Protector swims north to help UK children stay safe, Media release, 8 May 2008, http://www.netsafe.org.nz/keeping_safe.php?pageID=294&sectionID=corporate&menuID=275, viewed 3/11/10.
89. Hector's World Lesson Plans for Episodes 2-5; Information for teachers of Years 0-6, available http://hectorsworld.netsafe.org.nz/wp-content/uploads/yrs_0_6_teacher_info_sheet_episodes_2_5.pdf, viewed 3/16/10.
90. ACMA (2009), op. cit.
Digital citizenship combines three skills to create competency in participating safely and securely in the digital economy:92
- Digital etiquette: displaying appropriate and responsible behaviour while online;
- Digital literacy: proficiency to access, understand, participate in or create online content; and
- Digital security: securing one's own personal information.

We argue that this model of digital citizenship, covering the three key platforms of responsibility, literacy and security, could usefully be a part of Australian pedagogy, and would give children learning to engage online a strong basis of confidence.

Content and Access Regulation: An Opt-in Filter

In New Zealand, possession and publication of obscene material is covered by the Films, Videos, and Publications Classification Act 1993. It is illegal to possess, make, trade, distribute or display a publication deemed objectionable by the Classification Office.93 Classification decisions are made by the Classification Office only; the Court is not responsible for determining whether a publication is objectionable. Responsibility for policing what happens on the internet falls to the Department of Internal Affairs (DIA), which is primarily concerned with very serious offences, such as material featuring the abuse of children.

The Digital Child Exploitation Filtering System is now operating in New Zealand.94 It is a narrowly defined internet filter using the Swedish software Netclean Whitebox. Website requests are filtered against a blacklist held on a central server in the government's Censorship Compliance Unit. The list is maintained by the Independent Reference Group under the Department of Internal Affairs, which actively reviews banned URLs each month. The filter is not compulsory, but most ISPs have indicated that they will join it, although there is no public record of which ISPs are using it.
91 Hector's World Limited (2010), Evolution of Hector's World Learning Objectives 2009-2010, available http://hectorsworld.netsafe.org.nz/wp-content/uploads/hwl-
92 ACMA (2009), op. cit. p. 99.
93 Films, Videos, and Publications Classification Act 1993, ss 131-132A.
94 O'Neill, R. (2010), New Zealand's internet filter goes live, Computerworld, Stuff NZ, http://www.stuff.co.nz/technology/digital-living/3434754/New-Zealands-
According to the government, the list of sites the system offers to block includes only child abuse material; it is reviewed monthly, and there is a clear appeals process outlined in a public Code of Practice.95 In the process of filtering, an internet user's IP address is made anonymous, and data logs are kept for only 30 days. As in Australia, the New Zealand Government characterises the filter as just one tool in making the net safer:
The filtering system is a response to community expectations that the government and ISPs should do more to provide a safe internet environment. It is not a silver bullet that will prevent everyone from accessing any sites that might contain images of child sexual abuse, but it is another important tool in the Department's operations to fight the sexual abuse of children. Keith Manch, New Zealand Internal Affairs Deputy Secretary, 16 July 2009.96
By focusing solely on child abuse material, the filter is designed to block only material that would in any case be objectionable (and therefore illegal to possess) under the Films, Videos, and Publications Classification Act. Any person who circumvents the filter to possess or trade publications that promote or support the sexual abuse of children remains liable under this Act. It should also be noted that the filter covers only a subsection of objectionable material. Publications that promote or support other practices, such as bestiality and acts of torture, are also objectionable but are not subject to the filter.97 The plan has attracted criticism from online rights groups such as InternetNZ, which fear scope creep and question the overall effectiveness of the scheme, echoing criticisms of the proposed mandatory net filter in Australia:
It risks leaving parents feeling that the Government is providing a safe environment, but it cannot deliver on that promise. The filter would only help at the margin, and child abuse material would still be available on the internet.
The filter would disrupt the end-to-end connectivity that has made the internet the useful tool it is today. It creates some confidentiality concerns, and is not subject to all the usual lawful checks and balances that apply to all other parts of New Zealand's censorship regime.98
The rollback of the Digital Strategy initiatives and the discontinuation of the Digital Development Council are, from the perspective of the principles which ground this report, regressive moves. While the current filter is opt-in and remains appropriately narrow in its scope, the focus of NZ government policy has moved away from a forward-looking commitment to enhancing and resourcing user agency and digital citizenship. The full extent of this policy shift remains to be seen.
were passed in 1973 and 1990, the latter of which created the Independent Television Commission. Two other agencies supervised compliance: the Broadcasting Complaints Commission and the Broadcasting Standards Council. Commercial radio was regulated separately by two different agencies: the Radio Authority and the Radio Advertising Clearance Centre. Cable television was regulated by the Cable Authority, created in 1984 as a result of the Cable and Broadcasting Act; the Cable Authority eventually merged with the Independent Television Commission. As a result, businesses with diverse broadcast and telecommunications interests had to report to several agencies whose duties overlapped significantly. The complexity and redundancy of over-regulation, its cost to industry, and obsolete laws forced the government to propose a unifying Communications Bill in 2003, which allowed a converged regulator, Ofcom, to take over the responsibilities of five agencies.99 Now TV, radio, telecommunications and online content are all regulated under the one umbrella. The BBC, which remains subject to separate supervision by its governors, must now comply with Ofcom rules for the industry. The Communications Bill does not, however, supersede other relevant legislation in this area: the Wireless Telegraphy Act 2006, the Broadcasting Acts 1990 and 1996, and the Competition Act 1998. As the UK communications regulator, Ofcom oversees the wholesale and retail markets for all data networks. It also has a statutory duty to promote media literacy and to manage online risks. Ofcom's annual communications market report100 shows Britain's need for continual engagement in this rapidly changing sector.
Thanks in large part to the BBC's online iPlayer, which was receiving 70 million online requests per month in 2009, more than a quarter of households claimed to have watched TV programs online; that rises to a third among 15-24 year olds.101 Catch-up TV has been helped by the increased availability and take-up of broadband connections, wider access across computer platforms, heavy marketing campaigns, and distribution direct to television sets and gaming consoles. For example, Virgin Media, BT Vision and Tiscali TV now offer catch-up content directly through TV sets rather than through a computer.

Digital Britain & Digital Economy Bill 2009-2010
Released in June 2009, Digital Britain102 was an attempt to synthesise and co-ordinate a consistent approach to Britain's digital future, ensuring universal access to services and providing regulatory stability. The program outlines the liberalisation of spectrum for 3G, funding and investment for 3G networks, and plans to increase digital participation and improve digital skills, with a strong emphasis on self-regulation as a first practice. UK governance measures are also subject to, and shaped by, EU governance initiatives in this area, which are discussed in greater detail below. The former government sought to use Digital Britain as a means of promoting a range of industry efforts to increase trust and user confidence in the digital economy. Digital Britain does not explicitly outline online safety or user confidence programs. Rather, it reinstates support for changes made after the Byron Review's Safer Children in a Digital World report, which examined the effects of harmful content on young internet users. The Byron Review noted that a number of organisations work independently to develop and deploy online safety initiatives, and recommended a single approach. The UK Council for Child Internet Safety (UKCCIS) came into being to fit this brief: a stakeholder organisation with a focus on voluntary codes of conduct.
The Council addresses online child exploitation through law enforcement, education and awareness activities. Alongside UKCCIS, and part of the UK police, is the Child Exploitation and Online Protection
99 For a more nuanced discussion of this shift, see García-Murillo, M. (2005), Regulatory responses to convergence: experiences from four countries, Journal of Policy, Regulation and Strategy for Telecommunications, 7, 1; p. 20.
101 See the BBC's figures on iPlayer: http://blobfisk.com/wordpress/wp-content/uploads/2009/12/bbc_iplayer-01.PNG, viewed 2/4/2011.
102 Department for Business Innovation & Skills, Department for Culture, Media & Sport (UK) (2009), Digital Britain, available http://www.culture.gov.uk/what_we_do/
(CEOP) Centre, dedicated to eradicating the sexual abuse of children. CEOP runs the ThinkUKnow website. Since 2006, ThinkUKnow has reached over 4 million children and young people. In 2008-2009 alone, over 3,500 local professionals and industry volunteers were trained. Increasingly, UKCCIS and CEOP team up to produce and support each other's campaigns and even host each other's events and websites. The controversial Digital Economy Act 2010103 implements aspects of Digital Britain. The Act:
Extends the role of Ofcom to include reporting on communications infrastructure and media content;
Imposes obligations on internet service providers to reduce online copyright infringement via a three strikes and you're out policy, cutting off or degrading persistent illegal file-sharers' internet connections;
Allows the Secretary of State to amend copyright legislation to the same end without parliamentary consent (the hotly contested Clause 17, which has since been defeated);
Commits to giving courts the power to block websites that are infringing copyright; and
Extends the range of video games that are subject to age-related classification.
With the new government's passage into power in the UK, however, the Digital Britain scheme has taken a backseat, with portions of the agenda, including the web blocking clauses of the Act, under review.

UK Digital Content Self-Regulation: Internet Watch Foundation
Unlike Australia, Britain does not have primary legislation covering offensive content on the internet. Content deemed illegal under the Obscene Publications Act and the Public Order Act 1986, the relevant offline legislation, is also illegal in the online context; however, there is no specific legislation relating to inappropriate content for adults in the online world.
In its place, the UK encourages a self-regulatory approach, employing the services of the Internet Watch Foundation (IWF), established for the purpose of eliminating images of child pornography hosted anywhere in the world, as well as criminally obscene and criminally racist content hosted in the UK. The IWF works alongside law enforcement agencies worldwide and operates a notice and take-down procedure for content on UK sites, as well as a list of international child abuse sites that ISPs can block at the network level. The Internet Services Providers' Association (ISPA), the key UK industry association, indicates that all major ISPs, as well as the majority of smaller providers, have implemented the database. A critique of the IWF's strengths and weaknesses forms the basis for a discussion about future regulatory models in Section 3 of this report. The work of the IWF is supported by other self-regulatory initiatives. The ISPA has developed its own Code of Practice to deal with the issue of inappropriate content, to which all of its members are expected to adhere. This Code of Practice mandates that members must comply with take-down notices issued by the IWF and requires ISPs to provide relevant user details to the police. ISPA also urges its members to provide sufficient information about filtering tools to all of their customers.
103 http://www.parliament.uk/briefingpapers/commons/lib/research/briefings/snha-05616.pdf
Other Internet Content Regulation
The Good Practice Principles on Audiovisual Content Information104 were developed to ensure that consumers are able to make informed choices about the content they access in a fast-moving media environment. They were launched in February 2008. Participants include AOL, BBC, Bebo, BT, Channel 4, Five, Google, ITV, Microsoft, Virgin Media, and Yahoo! Among the principles that providers sign up to are:
Promoting and enabling media literacy through the provision of content information;
Offering content information in order to empower users and allow them to make informed choices about the content that they and their families access, consume and watch;
Offering information about content that may be harmful or offensive to the general public, and that may be unsuitable for children and young people. In particular, content information is designed to enable parents and carers to exercise supervision over the content viewed by those they are responsible for;
Employing editorial policies that reflect the context in which their content is delivered. While the exact format of the information may vary from provider to provider according to context, providers aim to present it in a way that is easy to use and understand.

Mobile Content Self-Regulation
Mobile companies have set up three self-regulatory bodies to deal with digital content that is deployed over their networks. The PhonepayPlus Code of Practice covers premium content (similar to Australia's Telephone Services (Mobile Premium Services) Determination 2005). The Independent Mobile Classification Body is responsible for setting a Classification Framework for certain new forms of mobile Commercial Content, against which content providers can self-classify and provide age-based access controls.105 The classifications include advice on violence, sex, nudity, language, drugs, horror, and imitable violent techniques.
With the increased internet functionality of mobile phones, and because the Independent Mobile Classification Body does not encompass the mobile internet, the Mobile Broadband Group has developed and updated its own self-regulatory code: the UK code of practice for the self-regulation of new forms of content on mobiles.106 The Code covers new types of content, including visual content, mobile gaming, chat rooms and internet access. The Code acknowledges that mobile operators have no control over the content that is offered on the internet and are therefore unable to insist that it is classified in accordance with the independent classification framework. Mobile operators therefore offer a filter on their internet access service so that the internet content thus accessible is restricted. The filter is set at a level intended to filter out content approximately equivalent to commercial content with a classification of 18+.
104 http://www.audiovisualcontent.org/BSG%20Good%20Practice%20Principles%20on%20Audiovisual%20Content%20Information%20One%20
105 Independent Mobile Classification Body (2005), IMCB Guide and Classification Framework for UK Mobile Operator Commercial Content Services, available at http://
106 Mobile Broadband Group (2009), UK code of practice for the self-regulation of new forms of content on mobiles, available at http://www.mobilebroadbandgroup.
Case Study: Click Clever Click Safe
As discussed above, the UK Council for Child Internet Safety is responsible for the implementation of the Byron Report. It brings together over 140 organisations and individuals to help children and young people stay safe on the internet. Launched by the Prime Minister on 29 September 2008, it is composed of companies, government departments and law enforcement agencies, charities, parenting groups, academic experts and others. In 2009, the Council released Click Clever Click Safe: The first UK Child Internet Safety Strategy.107
107 www.education.gov.uk/publications/eOrderingDownload/Click-Clever_Click-Safe.pdf, accessed 4/04/11, p. 3. See the Click Clever Click Safe website, available at http://clickcleverclicksafe.direct.gov.uk/index.html
108 (Reuters, 1996) in Yaman Akdeniz, The Regulation of Pornography and Child Pornography on the Internet, available online at http://www2.warwick.ac.uk/fac/soc/law/elj/jilt/1997_1/akdeniz1/#a5.3 (accessed 13/09/10).
children organised by UNICEF and the Council of Europe.109 The EU Safer Internet Action Plan came into being as a four-year action plan spanning 1999-2002, with a budget of 25 million Euros. The Safer Internet Action Plan was a three-pronged approach aimed at fostering a favourable environment for the development of the internet industry by promoting safe use of the internet and combating illegal or harmful content.110 The three areas identified for action were:
1. The creation of a safer online environment through the establishment of a European network of hotlines and the encouragement of self-regulation and the use of codes of conduct.
2. The development of filtering tools.
3. Awareness-raising.111
The time period initially covered by the Plan was subsequently extended until 31 December 2004, with a corresponding increase in the budget of 13.3 million Euros. In 2005 the European Council established the Safer Internet Plus program, a follow-on initiative to the Action Plan that covered the years 2005-2008.112 The plan was then re-extended for a further period covering 2009-2013. An evaluation of the program issued by the European Commission stressed that the Plan had had a positive impact in fostering networking and educating end-users about the safer use of the internet. In particular, the report concluded that:
The programme has done a good job in producing a number of filtering software products although take-up of rating needs to be increased. Moreover, not all stakeholders agree that filtering is the best approach to child protection.
At the policy level, the programme has been successful in putting the issue of developing a safer internet firmly on the agenda of the EU and the Member States; at action-line level, the Commission has instigated the development of a network of hotlines in Europe with associated members in the USA and Australia, funded research into awareness-raising with end users, stimulated the development of filtering and supported the development of an international rating system; the programme has been successful in linking up stakeholders to produce a community of actors, although the Commission is disappointed by the lack of involvement of industry as well as self-regulation organisations and consumer groups.
In addition, the authors of the evaluation recommended extending the objectives of the programme to encompass new and emerging communication technologies (e.g. 3G mobile telephones) that will influence children's use of the internet.113 The majority of these recommendations were taken up in subsequent programs, and the latest iteration of the Safer Internet Program, with a budget of 55 million Euros, is aimed at regulating not only illegal content but also harmful conduct such as grooming and bullying online.114 The latest program also encompasses so-called Web 2.0 communication services, and is aimed at developing expert knowledge about existing and emerging uses, risks and consequences of online technologies for children's lives, including the technical, psychological
109 Ibid. See also Feeley, Matthew J., EU Internet Regulation Policy: The Rise of Self-Regulation, 22 B.C. Int'l & Comp. L. Rev. 164 (1999), available at http://heinonline.org/HOL/Page?handle=hein.journals/bcic22&div=10&g_sent=1&collection=journals#170
110 European Union Legislation Summaries, http://europa.eu/legislation_summaries/information_society/l24190_en.htm (accessed 22/09/10); see also Electronic Frontiers Australia, available at http://www.efa.org.au/Issues/Censor/cens3.html (accessed 13/09/10).
111 European Union Legislation Summaries, http://europa.eu/legislation_summaries/information_society/l24190_en.htm (accessed 22/09/10).
112 Decision No 854/2005/EC of the European Parliament and of the Council of 11 May 2005 establishing a multiannual Community Program promoting safer use of the Internet and new online technologies, available at http://eur-lex.europa.eu (accessed 14/09/10).
113 COM (2006) 663, available at http://europa.eu/legislation_summaries/information_society/l24190_en.htm (accessed 24/09/10).
114 See the European Union Safer Internet Program Factsheet, available at http://ec.europa.eu/information_society/doc/factsheets/018-safer-internet.pdf (accessed 24/09/10).
and sociological aspects of online-related child sexual abuse.115 The program co-funds educative and self-regulatory initiatives, bringing together researchers at the European regional level, as well as providing end-users with national contact points to enable the reporting of illegal content or conduct online.116 Since 2004 the European network of awareness centres has worked with the International Telecommunication Union to organise Safer Internet Day,117 an awareness-raising day involving European and non-European countries that now takes place annually.

The European Union Audiovisual Media Services Directive (AVMSD)
The AVMS Directive sets out how every EU government should regulate online TV-like content such as YouTube and is currently the primary set of regulations pertaining to internet content in the region. First developed in the early 1980s in response to satellite broadcasting, the shared European Union audiovisual policy was revised in 1997, 2007 and again in 2010, when it was named the AVMSD.118 After the Directive entered into force on 19 December 2007, member states were given until 19 December 2009 to transpose the regulations into their own domestic legislation. The goals of the AVMSD, as set out on the European Union Audiovisual Services policy website,119 are:
providing rules to shape technological developments;
creating a level playing field for emerging audiovisual media;
preserving cultural diversity;
protecting children and consumers;
safeguarding media pluralism;
combating racial and religious hatred; and
guaranteeing the independence of national media regulators.
The regulations aim to be supportive and flexible, in order to strike a balance between the protection of users and the development of new business opportunities.
The AVMSD differentiates between linear and on-demand services, subjecting the latter to less onerous regulatory standards, as well as seeking to promote member states' use of self- and/or co-regulatory measures while eschewing new licensing schemes.120 The AVMSD factsheet states that through these measures national legislators are able to choose more flexible regulatory arrangements where these enjoy stakeholder support, align with their national legal systems and promise effective enforcement.121 The AVMSD is supplemented by the 1998 and 2006 Recommendations on the protection of minors and human dignity.122 Amongst other things, the Recommendations call on industry to develop positive measures, such as harmonisation through cooperation and the exchange of best practices between the regulatory, self-regulatory and co-regulatory bodies of the Member States, and to consider the possibility of creating filters for harmful content and instituting content labelling systems for material distributed online. The Recommendations also urge member states to encourage media literacy and responsible use of the internet amongst children, and to implement complaints-based or remedial systems in regard to the distribution of harmful or illegal content.
115 European Information Society website, available at http://ec.europa.eu/information_society/activities/sip/policy/programme/index_en.htm (accessed 24/09/10).
116 European Union Safer Internet Program Factsheet, available at http://ec.europa.eu/information_society/doc/factsheets/018-safer-internet.pdf (accessed 24/09/10).
117 See ITU and European Commission Joint Press Release, issued 10 February 2009, available at http://www.itu.int/newsroom/press_releases/2009/01.html (accessed 24/09/10).
118 See http://ec.europa.eu/avpolicy/reg/history/index_en.htm, accessed 23/03/11.
119 Available at http://ec.europa.eu/avpolicy/reg/tvwf/index_en.htm, accessed 23/03/11.
120 See Modern Rules for Audiovisual Europe Factsheet, available at http://ec.europa.eu/avpolicy/docs/reg/avmsd/fact_sheet_en.pdf, accessed 24/03/11.
121 Ibid.
122 Recommendation of the European Parliament and of the Council of 20 December 2006 on the protection of minors and human dignity, available at http://eur-lex.
Conclusion
In this section we have mapped other national approaches to managing convergent media content in comparable developed countries. Our analysis suggests that the most effective approaches to media content governance involve government working cooperatively with industry to ensure voluntary regulation and governance of content and to give users options and information. It is also clear that a comprehensive and effective national
strategy for managing media content must include funding to promote user confidence and digital literacy. In this section we also considered current transnational efforts to provide forums in which government and industry can work together in a global context. Given the global nature of the internet, it is clear that international cooperation will play an increasingly vital role in governance.
Our objective in this section is to explore the opportunity for a coherent content governance scheme that is flexible and responsive as well as pragmatic. Our model is grounded in an adaptive rather than a clean-slate approach, given that the latter represents an ideal rather than a pragmatic model. In preparing this section we hosted a roundtable discussion in order to hear expert commentary from Dr Peter Chen of the University of Sydney, Peter Leonard of the law firm Gilbert + Tobin, and David Simon, a former member of the Classification Board, and drew on a comprehensive international review of relevant literature. This section covers the guiding principles that we argue should underpin media content governance, looks at redefining classification and content to accommodate rapid change, and explores the road to building Australia's capacity to engage in a constructive dialogue with internet companies such as Twitter, Facebook, eBay and Google. We conclude the section with our recommendations.
131 See Mayer-Schonberger, Viktor (2003), The Shape of Governance: Analyzing the World of Internet Regulation, 43 Virginia Journal of International Law: 605-673, p. 611.
132 Ibid, p. 612.
For most nation states, internet governance involves balancing the risks and opportunities of networked media. This balancing act involves weighing the perceived harms of material deemed damaging to the community or national interest against the economic, educational and socio-cultural opportunities provided by access to information and services offered through convergent media. When a given nation state increases regulation to balance these risks and opportunities, industry groups can respond by moving to a state with different laws, thereby evading regulation and highlighting the difficulty for the nation-state of acting as unilateral regulator on the global stage. As Mayer-Schonberger puts it, in the internet economy, the market incentives are tilted against the states and their enforcement efforts.133 If the nation-state is the ultimate authority for the regulation, arbitration and enforcement of internet activity within its own sovereign jurisdiction, the logical corollary is that it has no jurisdiction to regulate activity outside its own borders. International law does recognise that in some limited circumstances a state may exercise jurisdiction extraterritorially, for instance over its own nationals when they are outside the state's territory.134 Furthermore, in cases of extreme crimes, such as incitement to genocide or the distribution of child pornography, the principle of universal jurisdiction allows any state to exert jurisdiction over a perpetrator residing in its territory regardless of the perpetrator's nationality or the location of the crime. However, neither the principle of extraterritoriality nor that of universality provides a state with jurisdiction to regulate content hosted offshore simply because that state deems it offensive or culturally harmful.
Legal theorist James Boyle argues that the technology of the medium, the geographical distribution of its users and the nature of its content all make the net especially resistant to state regulation: the state is too big, too slow, too geographically and technically limited to regulate a global citizenry's fleeting interactions over a mercurial medium.135 While the state is necessarily limited in its jurisdiction to regulate global flows of content, it must still formulate policies at the domestic level. For instance, its role in apprehending and punishing those responsible for clear-cut cases of criminal behaviour online, such as the distribution of child pornography, is essential and undeniable. Furthermore, the state is the only legitimate stakeholder capable of contributing to the development of policies between states on a bilateral or multilateral basis, and is therefore indispensable to the future of the convergent media landscape. As we note below, global internet companies also have an important role in ensuring international cooperation, and a vested business interest in doing so. Clearly, nation state governments must retain a robust role in convergent media governance. However, it is a role that must be strengthened through collaboration with other nation states, users and global industry groups.

Industry
Industry itself plays an increasingly active role in empowering and educating users through the adoption of new technologies that focus on safety and privacy concerns. In the convergent environment it is important to note that the term industry collapses a diverse group of actors who face different issues and who need to be understood in distinct ways in regulatory and policy terms: telecommunications companies, internet service providers, platform providers and, in some cases, professional content providers.
We are effectively moving from a vertical to a horizontal system of network layers in the convergent environment, and it is critical that we do not simply aggregate the existing legislation.
133 Ibid, p. 617.
134 The objective territorial principle was first introduced as a theory of extraterritorial jurisdiction by the Permanent Court of International Justice in The Case of the S.S. Lotus. See Walter C. Dauterman Jr., Internet Regulation: Foreign Actors and Local Harms - At the Crossroads of Pornography, Hate Speech, and Freedom of Expression (2002) 28 North Carolina Journal of International Law & Commercial Regulation 177, p. 185.
135 James Boyle, Foucault In Cyberspace: Surveillance, Sovereignty, and Hard-Wired Censors (1997), available online at http://www.law.duke.edu/boylesite/foucault.
There is a legitimate and key role for private companies to play alongside government, especially in areas where old-media models of content regulation do not hold. Platform providers, for example, have at their disposal the tools of exclusion, removal and referral to law enforcement authorities to encourage ethical online behaviour. Barring users who fail to abide by predetermined cultural norms from membership and participation is one popular tool, although while membership of popular online communities can be taken away, this does not necessarily prevent people from rejoining (sometimes using a different email address or name). The removal of harmful content prevents other users from inadvertently coming across it, and referral to law enforcement agencies where conduct has clearly crossed into the criminal sphere strengthens industry's capacity to arbitrate and enforce. Large private companies engaged in platform or search engine provision in the online space equally have to be aware of managing their brands in relation to user communities and perceptions of how flexibly and transparently they support the needs and views of those communities. We see a clear need for companies such as Facebook, Google and Twitter to be cognisant and respectful of users' desire to participate in the governance of their community spaces. The furore around privacy settings on Facebook in 2009-2010, when user preferences were not heeded, led to the creation of Quit Facebook Day on 31 May 2010. While it did not produce any substantial decrease in Facebook's user numbers, it did increase awareness of concerns about privacy settings. Concerns about privacy and security were further provoked when Google created its social networking and messaging tool, Google Buzz, which resulted in a complaint being filed with the Federal Trade Commission in the US.
The FTC found Google in breach of its own privacy principles and has instructed that it undergo regular privacy audits for the next 20 years.136 How responsively and responsibly internet companies listen to user concerns and incorporate them into their own development and governance may underpin the success of business models in the convergent media environment in the future.

In a transnational context, the legal obligations of private companies remain unclear on a range of legal and policy issues. This includes censorship and freedom of speech issues, and the past decade has seen growing concern about the way in which companies balance the goal of making money against the need to be ethical corporate citizens. A notable case arose when Yahoo!, under coercion from the Chinese government to comply with strict local laws, revealed data including login times, corresponding IP addresses and relevant email content belonging to the Chinese journalist Shi Tao, leading to his conviction and imprisonment on a ten-year sentence.137 Brian Israel argues that this is not simply a moral problem for large technology companies, but a business one: "This business quandary is the result of conflicting standards to which ICTs are simultaneously subject: the local regulations of authoritarian states, and a global standard informed by international human rights norms and societal expectations in the companies' home markets."138

In some cases, social networking sites have responded to privacy concerns by making changes to the manner in which personal information can be used. This was demonstrated by Facebook's responses to the Office of the Privacy Commissioner of Canada. In 2009, the Commissioner investigated a series of allegations about Facebook's default privacy settings and the ways in which it uses data, in the context of Canada's Personal Information Protection and Electronic Documents Act.
The Commissioner found Facebook's privacy information confusing: Facebook's account settings, for example, described how to deactivate a Facebook account but did not explain how to delete an account so that personal data is removed from Facebook's servers. The Commissioner also found problems with over-sharing of users' personal information with third-party developers who create Facebook applications. Facebook complied with all of the Commissioner's requests and made significant changes to its platform. Facebook now has a low, medium or high privacy setting, giving more granular control to users. The company also introduced a per-object privacy tool, giving users control at the
136 FTC Charges Deceptive Privacy Practices in Google's Rollout of its Buzz Social Network (2011), http://ftc.gov/opa/2011/03/google.shtm (accessed April 2, 2011).
137 For an extensive survey of the ways in which Yahoo! and other companies cooperate with the Chinese Government, see Human Rights Watch (2006), Race to the Bottom: Corporate Complicity in Chinese Internet Censorship, Volume 18, No. 8(C), available at http://www.hrw.org/en/reports/2006/08/09/race-bottom, viewed 24/04/10.
138 Ibid, p. 619.
time of uploading or sharing. There is also a new privacy tour for new registrants.139 The new settings affected all Facebook users, not just those in Canada. The investigation is a powerful example of how individual nations are reaching beyond their borders to regulate transnational ICT companies.

The End-User

There has been a shift away from the terminology of the "consumer" towards terms such as "users" and "digital citizens". These terms are most commonly applied in a general sense to mean people who use networked technologies, and in part this reflects the forms of highly active engagement that can occur in these spaces. The suggestion of passivity that comes with "consumer" does not adequately reflect the way people engage in mobile and online environments. How can we better understand the roles of users, the kinds of actions they can take, and how they actively shape our media ecology?

Rather than thinking about users as solitary individuals seeking out content, they can be better understood as active agents within a participatory culture.140 Users are driving the public culture of the internet, as evidenced in the growth of blogs, social media sites, video and photo sharing services, and the comments sections of all mainstream news and discussion sites. This kind of everyday participation makes or breaks online communities and internet businesses, and it is an essential part of the contemporary media industry. It represents a significant disruption of previous media models: the consumer, who once received a finished product at the end of an economic chain of production, has become an active player in a dynamic cycle of ever-changing content. Users determine where and whether a community will develop online, and how long it will last. But the role of users goes far beyond simply joining up with services, accessing data and then commenting on whether it is suitable or offensive.
Through their participation, they create normative language and behaviours, thus determining what will become the acceptable uses of an online space. Everything from bonding and discussion, to fights, criticising and trolling, to creating content, downloading and simply listening to other users, creates a current of activity that eventually shapes online engagement for other participants.141 This process needs to be taken seriously by media regulators. Indeed, users can be considered the most essential part of online governance. At a basic level, they can self-regulate within their own communities, using systems such as self-rating, reporting and inbuilt complaint mechanisms, such as those seen on Facebook or YouTube. These kinds of self-regulation mechanisms are also capable of crossing borders where state regulation cannot. Critical thinking by user groups is arguably the most effective protection available against the proliferation of harmful content. But beyond these basic functions, and more importantly, users play a vital role in determining what our convergent environments look like and how they function. From the normative effects produced by user communities to the capacity for creating thriving spaces of human interaction, users are at the heart of why things work or become non-functional. We argue that the next stage of media governance must carve out a much larger role for users at a national and supra-national level, and that they should be included within all the key bodies that consider media content.
139 Stoddart, J. (Canadian Privacy Commissioner) (2009), Press Release: Facebook agrees to address Privacy Commissioner's concerns, available at
140 Jenkins, H. (2006), Convergence Culture: Where Old and New Media Collide. New York: New York University Press.
141 Burgess, J. and Green, J. (2009), YouTube: Digital Media and Society Series. Cambridge: Polity Press.
Which Mode of Governance is Preferable?

Given the current complexity of the convergent media landscape, it is clear that reliance on any one stakeholder group alone is insufficient when it comes to governance of the online environment. Pitting each regulatory extreme against the other achieves very little. Contemporary media users exercise an unprecedented level of choice and control over the content they consume and, indeed, are frequently sources of content themselves. This digital literacy is shaping media users into active media citizens who expect industry and government to consult with and inform them about the risks and opportunities of media platforms and content. Media users are a stakeholder group that has been insufficiently recognised in the conventional regulatory framework for managing media content.

A critical issue, raised earlier when considering the efficacy of purely government-based forms of regulation, is the practical capacity of government agencies to regulate user-generated content. To give but one example: 24 hours of video are uploaded to the internationally available site YouTube every minute of every day, and the site is home to two billion views per day.142 The amount of material generated and viewed, some of it ephemeral, is clearly beyond the capacity of any national or international regulatory body to monitor and regulate in real time. In practical terms, there are simply not enough people, or hours in the day, to monitor and flag the sheer volume of content created by users daily. YouTube, like other platforms discussed below, has confronted this issue by adopting clear protocols concerning the type of content deemed appropriate for publication and by actively enlisting its users to flag inappropriate content, which is then placed in a queue, reviewed and taken down or, where appropriate, notified to the relevant authorities.
In this feedback loop, users accept moral agency for abiding by the guidelines of the particular site and flagging inappropriate content; industry equally assumes responsibility for providing users with a complaints and flagging mechanism, and for outlining clear guidelines for interacting with the site. The dialogue initiated by industry and participated in by user groups is then mediated by the third stakeholder, government, which assumes responsibility for acting when illegal content is referred to it. A mixture of all three forms of governance offsets the disadvantages and weaknesses of each mode of regulation with the advantages of the others. The procedural success of these feedback loops, as seen on platforms such as YouTube and Facebook, demonstrates the potential for similar mechanisms to be employed on a larger scale as a means of encouraging and facilitating interaction, transparency and accountability between user groups, industry and government.

In their current format, users are offered minimal power to contribute to the loop; however, we argue strongly that this should change in the future, with users being given greater responsibility to act as responsible agents in the governance and mediation of the platforms in which they participate, as well as in the overarching state rules of media governance. Industry, government and end-users can strengthen this combination of efforts by engaging in a dialogue in which each is equally a participant and a beneficiary. For instance, government and industry can work to increase the awareness and education of the user, and community and industry can in turn educate the government about community perceptions and ethical considerations. This quid pro quo relationship should be used to advance effective policy pursuits and to facilitate legitimacy and moral agency.
In turn, transnational efforts to govern internet content should not subjugate this tripartite framework to their own whims, but rather should feed into the system as another tool in Australia's policy armoury to inform users, to engage the industry, and to regulate harmful activity online.
142 Chapman, Glenn (2010), YouTube serving up two billion videos daily, AP, May 16.
143 Fuller, Matthew (2005), Media Ecologies: Materialist Energies in Technoculture. Cambridge, MA: MIT Press.
144 Lumby, C., Green, L. and Hartley, J. (2009), Untangling The Net: The Scope of Content Caught By Mandatory Internet Filtering, available at http://jmrc.arts.unsw.edu.au/jmrc-public-reports-and-submissions/-untangling-the-net/.
145 Potter, H. (1996), Pornography: Group Pressures and Individual Rights, Federation Press, Sydney, p. 85.
146 A C Nielsen (2006), Film and Video Content, A C Nielsen, Sydney, p. 5.
147 British Board of Film Classification (2005), Public Opinion and the BBFC Guidelines, London.
More research needs to be undertaken into community attitudes towards media content, particularly given the recent changes in its production, use and distribution. Our review of the history of regulation indicates that Australian media classification systems have not, to date, been built on sufficient empirical evidence about actual public attitudes or about actual media consumer behaviour. We recommend further research as a critical part of developing new policy in this area.

A key question that needs to be explored in rethinking media content governance is how criminal law intersects with other forms of governance. The issue needs to be understood both from the point of view of regulatory scope and from that of supporting and broadening stakeholder roles, given the key roles that industry and media user groups now play in potential notification. Child abuse material offers a useful, if disturbing, case study here. While it is imperative that the availability and distribution of child abuse material be strictly prohibited and prosecuted, it isn't necessary to differentiate between child abuse material distributed on the internet and child abuse material distributed in any other place or medium. It is clear that the production and dissemination of most child pornography begins with a primary crime: a sexual assault on a child. Producing, distributing and consuming images of the assault compound the crime and are, rightly, considered criminal activities, and should be regarded as such regardless of where the material is distributed. If we focus regulatory and public resources on over-regulating the internet, the presumed largest point of distribution, we risk pulling resources away from evidence-based strategies to prevent child abuse and to identify and prosecute the perpetrators and producers of this material.
That is not, of course, to say that law enforcement resources should not be directed to identifying the online producers and consumers of such material. And clearly Australia needs a body, currently ACMA, that maintains and enforces a blacklist of child abuse related websites. We also need a whole-of-society approach to the problem of child abuse and the material produced in its wake. If we build media content policy around worst-case scenarios, we risk grounding our media content governance in the lowest common denominator. We should, however, be strongly guided in policy by experienced law enforcers in the area and by social policy experts with evidence-based knowledge of how to prevent the abuse of children.

A key question for media content regulation online is to what extent we can and should rely on existing criminal laws to differentiate the content that is harmful and worthy of censorship from that which should be left to individual discretion. In exploring this question it is critical to acknowledge the role that active online communities, facilitated by industry, can play in notifying evidence of crime they encounter online. The UK Home Office has observed: "It is important to distinguish between illegal material and material that is legal but which some would find offensive. Self-regulation is an appropriate tool to address the latter. Dealing with illegal material is a matter for the courts and the law enforcement agencies."148 Peter Leonard argues that a better balance must be struck between the criminal codes and the classification codes: "Ultimately this has to be solved by finding the middle ground between what is currently RC and what is criminal."
148 House of Lords, Select Committee on Science and Technology (1996) Information Society: Agenda for Action in the UK, Session 1995-96, 5th Report, London: HMSO, 23 July 1996, available at http://www.parliament.the-stationery-office.co.uk/pa/ld199596/ldselect/inforsoc/inforsoc.htm, para. 4.163
149 Australian Communications Consumer Action Network (ACCAN) (2009), Future Consumer: Emerging Consumer Issues in Telecommunications and Convergent Communications and Media, p. 20.
150 Ibid, p. 21.
151 http://www.acma.gov.au/WEB/STANDARD/pc=PC_311474
152 ACMA (2009), Adult digital media literacy needs (August 2009), available at http://www.acma.gov.au/webwr/_assets/main/lib310665/adult_digital_media_
Section 4: Conclusion
Contemporary networked media are all now part of a complex ecology that draws together previously disparate platforms and participants, including governments, media industries and an international community of users. Online content is highly dynamic: it crosses borders, is constantly produced and consumed for an enormous variety of purposes, and its technologies and access points are in flux. In this environment, it is clear that media governance needs to be flexible, based on up-to-date research, grounded in international dialogue, and conducted through active dialogue and collaboration between the key players who constitute the networked environment.

There is no question that media governance has become more complex. However, this report has argued that there is a clear path forward, based on equitable and effective principles. The inconsistencies of the current media regulation system need to be remedied. We can no longer think of media forms vertically, existing in individual silos such as television networks, radio, newspapers, film and so forth. Rather, we need to think horizontally, across the shared layers of convergent media: networks, platforms and content. Content can be accessed on a multitude of devices, from mobile phones to tablets to laptops and internet-enabled games consoles and televisions. Policies need to be technology-neutral in order to adapt and remain useful.

The forms of media policy that offer the most flexibility and effectiveness for the 21st century will maximise opportunities for users to participate in various spaces, while also allowing them to filter content at their end point of the network. Users have more agency than ever to shape convergent media environments, and should be welcomed into governance processes as full participants.
Similarly, media and technology companies need to respect the needs of users as digital citizens, and to maximise their opportunities to have a say in the design of platforms, including privacy controls and transparency about how their data is used. New models of media governance will allow for both self-regulatory and co-regulatory frameworks, as we have seen in the various international examples in this report. Further, they need to emphasise the importance of media literacy, creativity and education. Encouraging users to develop their skills and knowledge will be a more effective basis for a thriving convergent environment than punitive top-down approaches, except in the case of clearly criminal behaviour.

It is clear from international experience and research that network-level filters do not increase media literacy, nor do they create a perfectly safe internet. Rather, network filtering is an opaque system that is open to abuse, and it gives the mistaken impression that all offensive or illegal content can be removed entirely. Frameworks for acceptable content are more effective, encourage users to engage critically with the spaces they use, and avoid the chilling effects of total network filtering.

In summary, media content governance in the 21st century needs to move away from the top-down approach that has dominated content regulation in the past and embrace a system grounded in co- and self-regulatory approaches, emphasising user agency and literacy. Government has a clear role in ensuring that industry groups commit to robust codes of practice and in promoting active industry and user engagement in policy development. Industry needs to resource and demonstrate a commitment to working collaboratively to give users a voice in how their data is managed and how platforms develop, and to enable users to notify inappropriate content.
In an era when it is media users themselves who are creating and exchanging much of our media content, it is essential that they are recognised as full digital citizens and given a clear role in media content governance and policy.
4.1. Recommendations
The authors of this report make the following recommendations:

1. The Creation Of A Convergent Media Board. The Board should be comprised of representatives from government, industry and user groups. The Board should have a broad remit: to consider social, cultural and regulatory issues in relation to convergent media content and to identify areas for further policy debate and research. It should not be charged with arbitrating individual complaints about convergent media content, as it will be separate from existing regulatory mechanisms and bodies, including ACMA. The Board's role is to engage with emerging technologies and to track the issues, innovative potential and community concerns that arise regarding media content production and distribution. It is the body that identifies any gaps in a broad-scale self- and co-regulatory system. The Board would provide the essential linking forum where government, industry and user groups can work together. Finally, it will act as Australia's centralised point of contact with international fora addressing media content governance. We also note the critical role that ACMA plays in media governance and the importance of ensuring it is adequately resourced to monitor complaints about media content and codes of industry practice.

2. A Full Review Of The Laws That Currently Regulate Media Content. Current media content regulation works in confusing ways across criminal codes, state laws and federal laws. These laws need to be reviewed to promote consistency across Australian states and to take account of the convergent media environment. We argue there is a clear need for a national R18+ category for games, and that research into community attitudes supports this move. On the Refused Classification (RC) category, we argue that its current framing is too broad and uncertain in scope, and that it should not be used as a mechanism for filtering online material. We note that the Classification Act does not offer sufficiently detailed criteria for determining whether content is RC, and it is time the category was given careful review.

3. Government Commitment To A Self-Regulatory Approach To Media Content Management, Including The Use Of Filters. The proposed mandatory internet filtering plan should be abandoned: the scope of filtered content is far too broad, the scheme is opaque to the general community, and it creates a false sense of security rather than enhancing user agency and literacy. In the convergent media era, government needs to commit to an approach to media content management that focuses on working collaboratively with industry to enhance the capacity of users to identify and notify inappropriate media content within a transparent system.

4. Industry Commitment To Codes Of Practice That Enhance User Agency. As the range of platforms for media content multiplies, it is critical that industry groups commit to updating codes of practice and that compliance with these codes is monitored by the Convergent Media Board in concert with ACMA. Further, industry should demonstrate a clear commitment to enhancing user capacity to identify and notify inappropriate content, to protecting user privacy, and to giving user communities a say in how their data is used and how platforms are managed.
5. The Funding Of Ongoing And Excellent Research. The federal government should work with industry to adequately fund expert research conducted by existing government entities and academic researchers, to ensure Australia stays in touch with public attitudes, user behaviours and emerging technologies.

6. Support For Media Literacy Education. Government and industry should work together to fund substantial programs in schools and in the community that ensure Australians have the skills and understanding to engage with convergent media with responsibility, knowledge and security. These programs should be based on research and designed in concert with educators. Media literacy should form part of a standardised national curriculum and include education about online security, ethics and literacy.

7. The Building Of National And International Frameworks And Links to ensure that government, industry and user groups have input into Australian policy and law making. In the convergent media environment, government, industry and media users need to work together, and they require fora which ensure that their dialogue has concrete public policy and law outcomes. The Convergent Media Board should actively identify and promote links with relevant international fora and agencies, and work in concert with ACMA to ensure Australians are positioned to reap the opportunities of the convergent media era while minimising the risks.
Gave the ABA the power to issue special take-down notices or special access-prevention notices prohibiting ICHs from hosting, and requiring ISPs to block, any content that was the same, or substantially the same, as prohibited content identified in a prior take-down notice (s 47)

Required that industry codes be developed (s 60)

Mandated that notices had to be complied with by no later than 6pm the next business day (ss 37(1)-(3))

Exempted ICHs and ISPs from liability for breach of State or Territory laws in regard to carrying offensive material if they were unaware of its presence (s 91(a) and (c)), and with respect to any requirement by a State or Territory that an ISP or ICH monitor, make inquiries about or keep records of internet content carried or hosted by them (s 91(b) and (d))
153 For more information: Arasaratnam, Niranjan, Brave New (Online) World (2000) 23(1) University of New South Wales Law Journal 205, pp. 10-13; Chen, Peter, Pornography, Protection, Prevarication: The Politics of Internet Censorship (2000) 23(1) University of New South Wales Law Journal 221, pp. 18-20; Scott, Brendan, Silver Bullets and Golden Egged Geese: A Cold Look at Internet Censorship (2000) 23(1) University of New South Wales Law Journal 215; Chen, Peter, Regulating the Internet Censorship? Australia's Internet Censorship Regime: History, Form and Future, 3 Macquarie Law Review (1999), pp. 121-142; Coroneos, Peter, Chapter 4 - Internet content policy and regulation in Australia, [2008] SydUPLawBk 10; in Brian Fitzgerald, Fuping Gao, Damien O'Brien, Sampsung Xiaoxiang Shi (eds), Copyright Law, Digital Content and the Internet in the Asia-Pacific (2008) 49.
154 CSIRO, Blocking Content on the Internet, June 1998, available at www.cmis.csiro.au/projects+sectors/blocking.pdf, accessed 25/03/10.
155 Arasaratnam, Niranjan, Brave New (Online) World (2000) 23(1) University of New South Wales Law Journal 205, p. 13.
156 Ibid, p. 31.5.
157 See ACMA Media Release, 21 December 2007, available at http://www.acma.gov.au/WEB/STANDARD/pc=PC_310907 (accessed 3/09/10).
158 See Coroneos, Peter, Chapter 4 - Internet content policy and regulation in Australia, [2008] SydUPLawBk 10; in Brian Fitzgerald, Fuping Gao, Damien O'Brien, Sampsung Xiaoxiang Shi (eds), Copyright Law, Digital Content and the Internet in the Asia-Pacific (2008) 49, p. 63.
159 Ibid, p. 63.
160 Lindsay, D., Rodrick, S. and de Zwart, M., Regulating Internet and Convergent Mobile Content (2008) 58 Telecommunications Journal of Australia 31.1-31.29, pp. 31.7-31.8.
161 Ibid, p. 31.9.