Transparency report

Parent entity: Mega Limited
Content-sharing service: MEGA

mega.io

Publication date: Oct 6, 2022, 10:07 AM UTC

Reporting period: 1 January – 30 June 2021

Section 1
#Q1

Which of the following best describes the primary type(s) of online service(s) that you provide?

  • Cloud-based storage and sharing
  • Messaging
  • Video chat
  • Video sharing

#Q1_comments

If you would like to add a comment or other information, please do so here.

Mega provides end-to-end encrypted cloud storage and text/voice/video chat.

Section 2
#Q2

Do you prohibit terrorist and/or violent extremist content on your service?

Yes

#Q2_yes

Do you use one or more specific, publicly available definitions or understandings of terrorist and/or violent extremist content?

Yes

#Q2_yes_yes

Please provide the publicly available definition(s) or understanding(s) that you use, along with your relevant terms of service or policies.

MEGA's Terms of Service and Takedown Guidance Policy make it clear that MEGA has zero tolerance for violent extremism, and reference section 3 of the New Zealand Films, Videos, and Publications Classification Act 1993.
See https://mega.nz/takedown and https://mega.nz/terms

#Q2_comments

If you would like to add a comment or other information, please do so here.

Violent extremist content is processed in the same manner as other objectionable material as defined in section 3 of the New Zealand Films, Videos, and Publications Classification Act 1993, or other harmful online material, including as defined by the Harmful Digital Communications Act 2015: the folder/file links are immediately deactivated, the user's account is closed, and the details are provided to the New Zealand Government authorities for investigation and prosecution.

Section 3
#Q3

Do you use any of the following methods to detect terrorist and/or violent extremist content on your platform or service?

  • Flagging by individual users or entities
  • Government referrals
  • Government legal requirements
  • Cross-company shared databases or tooling
  • Trusted notifiers

#Q3_1

Can you determine the total amount of content that was flagged or reported as terrorist and/or violent extremist content on your service during the reporting period?

Yes

#Q3_1_yes_1

Are you willing and able to disclose it?

Yes

#Q3_1_yes_1_yes

How much content, in total, was flagged or reported as terrorist and/or violent extremist content on your service during the reporting period?

2021   File Link   Folder Link   Total
Q1     419         9             428
Q2     347         15            362

#Q3_1_yes_2

Can you determine the amounts of content that are flagged or reported as terrorist content separately from the amounts of content that are flagged or reported as violent extremist content on your service?

No

#Q3_comments

If you would like to add a comment or other information, please do so here.

We remove duplicate reports from our public statistics.

Section 4
#Q4

Can you determine the total amount of content that is flagged or reported as terrorist and/or violent extremist content according to the method of detection?

Yes

#Q4_comments

If you would like to add any comments or you can provide any relevant data, please do so here.

The totals provided here are reports, not URLs as in Section 3, because we do not track URLs in disaggregated form.

Section 5
#Q5

Please select all interim or final actioning methods that you use on terrorist and/or violent extremist content.

  • Content removal
  • Suspension/removal of account
  • Content blocking

Section 6
#Q6_1

Can you determine the total amount of terrorist and/or violent extremist content on which you took action during the reporting period?

Yes

#Q6_1_yes

Are you willing and able to disclose it?

Yes

#Q6_1_yes_yes

Please provide that amount, along with any breakdowns that are available.

All links are immediately deactivated, the user's account is closed, and details are provided to the New Zealand authorities.

#Q6_3

Can you determine the total number of accounts on which you took action during the reporting period for violations of your policies against the use of your service for terrorist and/or violent extremist purposes as a percentage of the average number of monthly active accounts during the reporting period?

Yes

#Q6_3_yes

Are you willing and able to disclose it?

Yes

#Q6_3_yes_yes

Please provide that percentage, along with any breakdowns that are available.

The accounts that were closed because they shared violent extremist files represented 0.0001% of MEGA's total registered users.

#Q6_comments

If you would like to add other comments or information, please do so here.

We took action on 100% of the content that was flagged as terrorist and/or violent extremist content (TVEC).

Section 7
#Q7

If your service includes livestreaming functionality (even if it is not among what you consider to be the primary functionalities), then given the potential for terrorists and violent extremists to exploit livestreaming in ways that could promote, cause, or publicize imminent violence or physical harm, do you implement controls or proactive risk parameters on livestreaming to reduce misuse?

No livestreaming functionality

Section 8
#Q8

Please provide details on how you balance the need to action terrorist and/or violent extremist content with the risk that such content can be mislabelled and may actually be denouncing and documenting human rights abuses, or that it does not otherwise violate your terms of service.

If a user disputes the closure of their account for alleged violent extremist content sharing, MEGA asks the relevant New Zealand Government agency to review the content and advise whether it was violent extremist material or was misreported. In the latter case, the user's account can be reopened.

Section 9
#Q9

Do you have an appeal or redress process for content and/or account actioning decisions made under your terms of service on terrorist and/or violent extremist content?

Yes

#Q9_yes_1

Please provide a detailed overview of those processes.

If a user disputes the closure of their account for alleged violent extremist content sharing, MEGA asks the relevant New Zealand Government agency to review the content and advise whether it was violent extremist material or was misreported. In the latter case, the user's account can be reopened.

#Q9_yes_2

Is your appeal or redress process available to the user who posted the content or owns the account in question?

Yes

#Q9_yes_2_yes

Is the outcome of your appeal and redress process available to the user who posted the content or owns the account in question?

Yes

#Q9_yes_3

Is your appeal or redress process available to the person or entity who requested actioning?

No

#Q9_yes_4

What is the total number of appeals received from all sources, during the reporting period, following content or account actioning decisions under your policies against terrorist and/or violent extremist content?

29 appeals across all types of actioning were received during the reporting period. The number relating just to violent extremism is not available.

#Q9_yes_4_1

How many such appeals were decided during this reporting period (regardless of when those appeals were received)?

29 appeals across all types of actioning were decided during the reporting period. The number relating just to violent extremism is not available.

#Q9_yes_4_2

Of those, how many were granted?

1

#Q9_yes_4_3

If you can break these numbers (appeals received, decided and granted) down with any more detail, please do so here.

N/A

Section 10
#Q10

How, and how often, do you measure and evaluate the efficacy and/or room for improvement of your policies in each of the following areas?

MEGA monitors the type of reports being received and has a continuous improvement process.

Section 11
#Q11

Do you have a point of contact (such as a dedicated email alias, desk or department) that can be contacted during a real-world or viral event with direct online implications and which works to address harmful content on your service?

Yes

Section 12
#Q12

Are you a member of an international crisis protocol aimed at reducing the volume and impact of terrorist and/or violent extremist content online during a crisis?

Yes

#Q12_yes_1

Please identify the protocol.

GIFCT Content Incident Protocol (CIP)

#Q12_yes_2

Did your company participate in or benefit from the activation of a crisis protocol aimed at reducing the volume and impact of terrorist and/or violent extremist content during the reporting period?

No such crisis protocol was activated during the reporting period.

#Q_metrics

If you wish to report any additional metrics or information, please use this space to do so.