EXECUTIVE SUMMARY:

As an increasing amount of private data is collected, sold, and shared, the security of that data has come under heightened scrutiny.

Last week, lawmakers grilled tech executives on the subject.

“Do you think an average consumer who uses your products fully understands Google builds a profile about her, tracks where she works, tracks where her boyfriend lives, tracks where she goes to church, tracks where she goes to the doctor?” asked Senator Josh Hawley, pointing to the stalker-like quality of this data collection and the lack of transparency around how the data is secured.

Beyond the sheer volume of data that could potentially be distributed lies an even greater threat: machine learning programs that can be used to analyze that data.

Using machine learning, researchers found that the authors of anonymously written computer code could be identified based on stylistic patterns in code with known authorship.
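As a rough illustration of the kind of technique described, the sketch below trains a classifier on stylistic patterns (character n-grams) in code samples with known authors, then guesses the author of an "anonymous" snippet. It uses standard scikit-learn calls; the snippets, author names, and feature choices are hypothetical, and the actual research relied on far larger corpora and richer stylometric features.

```python
# Illustrative sketch of ML-based code authorship attribution.
# All training data below is hypothetical.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.ensemble import RandomForestClassifier

# Tiny made-up training set: code samples with known authors.
known_code = [
    "for i in range(10):\n    print(i)",
    "i = 0\nwhile i < 10:\n    print(i); i += 1",
    "def total(xs):\n    return sum(xs)",
    "def total(xs):\n    t = 0\n    for x in xs: t += x\n    return t",
]
known_authors = ["alice", "bob", "alice", "bob"]

# Character n-grams capture low-level stylistic habits such as
# spacing, naming, and statement layout.
vectorizer = TfidfVectorizer(analyzer="char", ngram_range=(2, 4))
features = vectorizer.fit_transform(known_code)

model = RandomForestClassifier(n_estimators=100, random_state=0)
model.fit(features, known_authors)

# An "anonymous" snippet: the classifier guesses its likely author
# from stylistic patterns alone.
anonymous = ["def total(nums):\n    return sum(nums)"]
print(model.predict(vectorizer.transform(anonymous)))
```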

If that sounds benign, machine learning researchers have also found that the types of cars visible in Google’s Street View image database can be used to infer local political affiliations, and that online search history can be used to detect neurodegenerative disorders such as Alzheimer’s disease.

The aggregation and solicitation of this kind of information could wreck reputations the world over. The most substantial risk to privacy and security comes not from the information we choose to share, but from the information we never consented to disclose.

Get the full story at TechRepublic.