AI, the Internet of Things, and post-quantum encryption are set to have a significant impact on data privacy.
AI models improve with the size of the datasets they’re trained on, so companies are scraping virtually every publicly available source of data to feed their models.
AI can process vast amounts of data in seconds and identify anomalies too subtle for a human to notice.
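To make that concrete, here is a minimal, purely illustrative Python sketch (the login counts and threshold are invented for this example) of the kind of statistical anomaly detection an automated system can run across millions of accounts at once:

```python
# Illustrative only: flag a statistical anomaly in a stream of daily login
# counts. The data and the z-score threshold below are hypothetical.
from statistics import mean, stdev

# Hypothetical daily login counts for one account; the spike on the last day
# is easy to miss among millions of accounts but trivial for a model to catch.
daily_logins = [3, 4, 2, 5, 3, 4, 3, 2, 4, 3, 27]

mu = mean(daily_logins[:-1])
sigma = stdev(daily_logins[:-1])

for day, count in enumerate(daily_logins, start=1):
    z = (count - mu) / sigma   # distance from the account's normal pattern
    if abs(z) > 3:             # common rule-of-thumb cutoff for outliers
        print(f"day {day}: {count} logins looks anomalous (z = {z:.1f})")
```

Real systems use far more sophisticated models, but the principle is the same: patterns that are invisible line by line stand out immediately in aggregate.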
Smart devices connect to the internet, collect data, and may share it with third parties, so it’s important to review privacy settings and limit data collection.
Post-quantum cryptography is still under development, as current encryption standards are expected to become vulnerable once cryptographically relevant quantum computers arrive, which some forecasts place around 2030.
“Harvest now, decrypt later” is a strategy in which bad actors collect and store encrypted data today so they can decrypt it in the future, once new technology makes that possible.
Governments, businesses, and other organisations are already taking precautions against Q-Day attacks, and consumers can do their part by staying informed and choosing products that adopt post-quantum encryption standards.
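For a sense of what adopting post-quantum encryption looks like under the hood, the sketch below assumes the open-source liboqs-python bindings (the `oqs` package) are installed; the exact algorithm name shown depends on the library version, so treat this as an illustration rather than a drop-in recipe:

```python
# Minimal sketch of a post-quantum key exchange, assuming the liboqs-python
# bindings are installed ("pip install liboqs-python"). The algorithm name
# ("ML-KEM-512" here) depends on the liboqs version available.
import oqs

ALG = "ML-KEM-512"  # NIST-standardised lattice-based KEM (formerly Kyber)

with oqs.KeyEncapsulation(ALG) as receiver, oqs.KeyEncapsulation(ALG) as sender:
    public_key = receiver.generate_keypair()             # receiver shares this openly
    ciphertext, sender_secret = sender.encap_secret(public_key)
    receiver_secret = receiver.decap_secret(ciphertext)  # recover the same secret

    # Both sides now hold an identical shared secret believed to resist future
    # quantum computers, which is what blunts "harvest now, decrypt later".
    assert sender_secret == receiver_secret
```

Products described as “quantum-safe” typically run an exchange like this, often combined with a classical algorithm, before any data is encrypted, so that traffic recorded today cannot be decrypted later.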
Opting out of having your personal data used to train AI is now an option on many social media platforms and some dating apps.
Privacy policies are evolving quickly in response to AI, but for now anything posted publicly can generally be used to train AI models.
Limiting post visibility to “friends” or “followers” is one way to protect personal information (for now).