Thousands of victims have sued Apple over its alleged failure to detect and report illegal child pornography, also known as child sexual abuse material (CSAM).
Apple scrapped a controversial CSAM-scanning tool last fall after digital rights groups raised concerns that it could be misused.
Survivors of child sexual abuse have accused Apple of using cybersecurity as an excuse to ignore its mandatory CSAM reporting duties.
If Apple loses the lawsuit, it could face more than $1.2 billion in penalties and may be required to implement CSAM-detection measures on iCloud.