National law reform: Watching Big Brother

Cite as: (2008) 82(10) LIJ, p. 72

The ALRC’s final report on Australian privacy law considers the impact of new technologies on individual privacy.

On 30 January 2006, the Australian Attorney-General asked the Australian Law Reform Commission (ALRC) to examine whether the Privacy Act 1988 (Cth) and related laws and practices continued to provide an effective framework for the protection of privacy in Australia. On 11 August 2008, the ALRC’s final report, For Your Information: Australian privacy law and practice (ALRC 108), was tabled in the Australian Parliament.

ALRC 108 comprises three lengthy volumes and contains 295 recommendations for reform. It is the product of 28 months of extensive research into privacy laws and involved the largest consultation program in the 33-year history of the ALRC. During the course of the inquiry, the ALRC met with approximately 250 stakeholders, held several roundtables and public forums on privacy-related issues, and conducted a number of workshops on privacy with children and young people. It also received 585 written submissions from a broad cross-section of individuals, government agencies and private sector organisations.

Issues examined in ALRC 108 include inconsistency of privacy laws, the privacy of personal health information, exemptions from the Privacy Act, cross-border data flows, and whether the Australian government should introduce data-breach notification requirements and a statutory cause of action for a serious invasion of privacy. Several recommendations relevant to the impact of developing technology on privacy are discussed in this article.

Developing technology

The ALRC’s 1983 report, Privacy (ALRC 22), noted the emerging power of computers and the privacy concerns associated with it. Since the release of ALRC 22, rapid advances in information, communication and surveillance technologies have created a range of privacy issues.

ALRC 108 examined the privacy implications of a number of technologies, including biometric systems, smart cards, radio-frequency identification (RFID), location detection technologies, the internet, and ubiquitous computing. These technologies facilitate easier, cheaper and faster methods by which information may be collected, accessed, aggregated, communicated and stored. Many stakeholders were concerned that the features of new and emerging technologies could lead to interferences with the privacy of individuals, and to identity theft or other discriminatory outcomes flowing from a breach of privacy.

The ALRC’s approach to regulating developing technology is set out in Part B of the report; however, the impact of technology on privacy is addressed throughout ALRC 108. Part J, for example, makes several recommendations directed towards certain telecommunications technologies, and Part G considers the use of online social networking sites by children and young people.

Technology-neutral privacy principles

Currently, the Privacy Act does not regulate specific technologies used by agencies and organisations to handle the personal information of individuals. In its inquiry, the ALRC considered whether certain types of technologies pose such a threat to privacy that the Act should be amended to regulate them directly.

The ALRC observed that existing technologies do not fundamentally alter the information-handling cycle. For example, surveillance devices and RFID systems may facilitate the collection of personal information without the knowledge or consent of an individual, but the collection of the information will still be regulated by the recommended “Collection” principle in the ALRC’s model Unified Privacy Principles (UPPs). Similarly, the storage and destruction of personal information that is held in an electronic form must take place in accordance with requirements in the recommended “Data Security” principle.

The ALRC concludes that high-level, technology-neutral privacy principles are the best way to regulate developing technology. It is not desirable to amend the privacy principles on the basis that technologies yet to be invented or deployed may not be accommodated by the existing principles. Technology is developing at such a rate that any attempt to regulate specific technologies through the Privacy Act has the potential to render the Act quickly out of date. Moreover, regulation of this nature is consistent with the ALRC’s overall approach to regulating privacy.

A technology-aware regulatory framework

The ALRC recognises that technology-neutral privacy principles will be effective only if the regulatory framework that supports the primary legislation is technology aware. Key aspects of a technology-aware regulatory framework include:

  • the potential for technology-specific regulations or legislative instruments;
  • promotion of privacy-enhancing technologies (PETs), and in particular the incorporation of PETs at the stage of systems design;
  • a proactive Office of the Privacy Commissioner (OPC) that provides guidance and education to agencies, organisations and individuals; and
  • engagement on the part of the OPC and the Australian government with regulators in other jurisdictions and international bodies concerned with privacy.

Technology-specific regulation

There is scope in the Privacy Act for co-regulation, which can provide industry-driven rules for the development and deployment of certain technologies. The Biometrics Institute Privacy Code is an example of a code that was initiated by the biometrics industry and, following approval by the OPC, became a legislative instrument.1

If in the future the OPC found it necessary to initiate a code to address the handling of personal information using a certain technology, such as RFID, the OPC could lobby the minister responsible for administering the Privacy Act to have such a code included in regulations.

Technology-specific regulations or other legislative instruments offer a more flexible approach than amendment to the primary legislation. Such instruments are consistent with the ALRC’s three-tiered approach to privacy regulation (primary legislation, complemented by legislative instruments and guidance), and do not represent a failure of technology-neutral privacy principles. Instead, they indicate that information handled by particular technologies may require stronger protection in certain, limited circumstances.

Privacy-enhancing technologies

The way that technology is used often determines whether its impact is privacy enhancing or invasive. The term “PETs” can refer to technologies that:

• form part of the architecture of technology systems used by agencies and organisations (for example, mandatory access control devices or identity management systems); or

• are used by individuals to exercise control over the collection of their personal information (for example, encryption and RFID signal blockers).2

ALRC 108 concludes that PETs can enhance security and promote trust and, therefore, are an essential component of the regulatory structure. Some PETs, however, can be physically unwieldy and costly to implement. Moreover, use of PETs may require a certain level of technological expertise. PETs alone cannot address the impact of developing technology on privacy and they should complement the privacy legislative and regulatory structure rather than be the central touchstone of that structure.

The role of the OPC – proactive regulation

A technology-aware regulator plays a crucial role in ensuring that new and emerging technologies do not have an adverse impact on privacy. The ALRC concludes that the OPC should:

• provide guidance on how certain requirements in the model UPPs can be met by agencies and organisations using particular technologies to handle personal information;

• educate individuals about how PETs can be used to protect privacy;

• educate agencies and organisations about designing and deploying new and developing technologies in a privacy-enhancing way; and

• promote the use of Privacy Performance Assessments3 and Privacy Impact Assessments4 to ensure that PETs are incorporated into systems and processes at the stage of systems design.

International engagement

The global nature of technology development and deployment requires industry, consumer and privacy advocates, the OPC and the Australian government to coordinate and engage with others in the international arena. International engagement would assist the OPC to develop technology-specific guidance on the application of the model UPPs – for example, the OPC could refer to the work of international standards-making bodies in determining minimum privacy and security benchmarks that should be met when designing technical systems.

The internet

ALRC 108 makes specific recommendations to address privacy concerns about the internet, a technology that has fundamentally altered the way individuals, government agencies and private sector organisations interact. The internet has also changed the nature of the “public domain”. During the course of the inquiry, a number of stakeholders expressed concern about the amount of personal information published on the internet, the permanent nature of the publication of that information, and the ease with which others could access, search and collate that information to create a profile of an individual.

Jurisdictional issues and mirror sites make it difficult to enforce standards for the collection, use and disclosure of personal information contained in online publications. Individual privacy concerns must be balanced against public interests – such as accessibility and transparency – in making certain types of information publicly available. In some circumstances, these public interests remain relevant for generally available publications published in an electronic format.

The ALRC concludes that the most appropriate way to address privacy concerns about the internet is to limit the amount of information that is made available online in the first place.

Public registers

Public registers provide an illustration of the ALRC’s approach in this area. In the late 19th century, governments began systematically to compile and retain records of their citizens. Today, legislation may require these records to be used to create public registers. Public registers often promote important public interests – for example, a publicly available electoral roll promotes free and fair elections. There is, however, a tension between the public interests served by a public register of information and the privacy of individuals included on the register. This tension is exacerbated when individuals are compelled to provide the information that is included in the register.

Legislation establishing a public register can limit the personal information that may be placed on the register, and set out the permitted uses and disclosures of personal information acquired from the register. ALRC 108 concludes that federal legislative instruments establishing public registers containing personal information should clearly set out any restrictions on the electronic publication of that information.


ERIN MACKAY is a legal officer with the Australian Law Reform Commission. For further information or to obtain a copy of ALRC 108, visit the website http://www.alrc.gov.au or phone (02) 8238 6312.

1. Privacy Act 1988 (Cth), pt IIIAA.

2. Commission of the European Communities, Communication from the Commission to the European Parliament and the Council on Promoting Data Protection by Privacy Enhancing Technologies (PETs), COM(2007) 228 (2007), 3.

3. The OPC has a number of functions under the Privacy Act to audit compliance. ALRC 108 recommends that audits conducted by the OPC should be referred to as “Privacy Performance Assessments” to emphasise the educational and non-confrontational nature of the process.

4. A “Privacy Impact Assessment” (PIA) considers the privacy effects of a proposal or an activity. PIAs are most effective when carried out in the design stage of a new project and integrated into the decision-making process for the project.
