The altar of technology demands many sacrifices.
We sacrifice our time by keeping current with the endless streams of social media and news. We sacrifice a bit of our wealth in the constant desire to keep up with the latest technology that promises to be only slightly more convenient or innovative than the previous model.
We can even sacrifice our personal security by using “free” services that conveniently store our information for us but are also susceptible to breaches that expose our sensitive data. The largest sacrifice we, as a society, offer to the altar of technology, however, is the loss of many aspects of personal privacy, and it would seem that most of us are frankly okay with this.
Consider for a moment how our attitude towards what we consider private has shifted: the mass drive towards centralization of data has changed our very definition of the word.
Read any free cloud provider’s End User License Agreement (EULA) and, once you wake up from the experience, you’ll remember that in the fine print you’ve essentially agreed to let the cloud provider data mine your information for marketing and analytics purposes.
Since only about 7% of us actually read these agreements, and I’m fairly certain that 99% of that 7% are the lawyers who draft them, none of us really knows the extent of the data the providers can extract.
This blind acceptance is the primer for the conditions of changing the properties of what privacy actually is. In this vein, we can now see the shift in ourselves and what we have truly sacrificed.
Human Habits Are Now On Full Display
Around 2012, a rather angry father of a high-school-aged girl walked into his local Target store outside of Minneapolis and demanded to see a manager. He wanted to know why Target was direct mailing his daughter advertisements for baby cribs and other baby-related items.
Was Target trying to encourage his teenage daughter to get pregnant? As you may have already guessed, Target had inadvertently outed this man’s daughter as secretly pregnant and he apologized to Target after she disclosed her situation to him.
Target, in order to enhance their sales, had been data mining this girl (as well as the millions of patrons that walk into Target daily) in an attempt to identify her needs and habits based on spending patterns.
This newly pregnant patron began buying items at Target that were consistently purchased by women who were pregnant. Therefore, the database flagged her with a high probability of being pregnant and, wanting her to purchase more, sent her advertisements the database deemed correct for her situation.
This is a textbook example of where privacy has been diluted by technology. Never mind that Target has statistical data that allows them to categorize minors as pregnant; they have the ability to predict very personal and obviously private information based on the patron’s interaction with their network of stores and spending habits.
Use a credit card in their store, Target-branded or otherwise, and you’re in a database that reveals more about your life than you may care to admit.
Buying self-help books? Target knows you may be depressed. Add a prescription for Prozac at their pharmacy, or a steady diet of comedy movies, and now they can run a statistical model that says you’re 85% likely to be depressed.
Listed as married in their database? If you suddenly start buying frozen dinners and cheap furniture then odds are you’re going through a divorce.
You can run any number of combinations of data to see patterns emerge that allow for highly personalized results from any demographic you can think of. This is Big Data and Analytics 101, as well as a cybersecurity nightmare (but more on that one later).
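To make the mechanics concrete, here is a toy sketch of this kind of purchase-pattern scoring. The categories, weights and bias term are all made up for illustration; real retail models are trained on millions of transactions, but the basic idea of summing weighted signals into a probability-like score is the same.

```python
import math

# Hypothetical signal weights: positive values push the "likely pregnant"
# score up, negative values push it down. These are invented numbers,
# not any retailer's actual model.
PREGNANCY_SIGNALS = {
    "unscented_lotion": 1.2,
    "prenatal_vitamins": 2.5,
    "large_tote_bag": 0.6,
    "cotton_balls": 0.8,
    "wine": -1.5,
}

def pregnancy_score(purchases):
    """Return a 0-1 score from a list of purchased category names."""
    total = sum(PREGNANCY_SIGNALS.get(item, 0.0) for item in purchases)
    # Squash the weighted sum through a logistic function; the 2.0 bias
    # (arbitrary here) keeps an empty basket scoring low.
    return 1 / (1 + math.exp(-(total - 2.0)))

basket = ["prenatal_vitamins", "unscented_lotion", "cotton_balls"]
score = pregnancy_score(basket)  # high score: flag for baby-item mailers
```

Swap in weights for frozen dinners and cheap furniture and the same ten lines become a divorce predictor, which is exactly why aggregated purchase data is so sensitive.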
When Free is a Threat to Privacy
We all have free cloud accounts. Google Drive, Apple iCloud and other free services offer an easy and convenient way to sync your pictures, files, email, contacts and pretty much everything else in your technological life.
They’re simple to set up, easy to use and, best of all, they’re free. What many individuals do not realize is they are sacrificing a good deal of private information for this ease of use.
Sure, everyone knows that Google reads their email so it can advertise to them more effectively. Mention in an email to a friend that you’re looking for a new car, and soon car advertisements will start popping up. It goes much deeper than this, however: most people don’t realize that many of these services give company employees the ability to access their accounts.
In a rather notable case, an employee at Google was fired when he began stalking underage kids online. He went directly into their Gmail accounts to read their emails, listen to their Google Voice messages and insert his own information into their accounts.
Further, because free cloud services tend to favor ease of use over privacy and security, we endlessly read about large-scale breaches that expose their users’ data. In the last year alone, there have been multiple breach disclosures that many users of these providers may not have heard about unless they’re actively paying attention.
It was discovered in mid-2016 that Dropbox’s 2012 breach, long thought to involve around 7 million passwords, was actually almost 70 million in size! From a reputation-damage standpoint, that disclosure gap is utterly ridiculous. However, Dropbox’s disclosure is amateur hour compared to Yahoo’s public admission that 500 million mailboxes were exposed to hackers.
This breach was so deep and massive that Yahoo claimed it could only have happened through a state-sponsored effort. Verizon, which is interested in purchasing Yahoo, wants a $1 billion reduction on Yahoo’s sale price. Verizon knows this deeply impacts the public’s trust in the Yahoo brand and that people will now start migrating to other free email services like Gmail, though those are not immune to breaches either.
Naturally, Yahoo responded in the worst way possible: it made it harder for people to leave by removing users’ ability to forward email to another account. Users on some forums have also complained that external email clients, like Microsoft Outlook, now constantly ask for login credentials, which could indicate a shortened timeout period meant to prevent people from copying out their data with a local export.
We as a society need to realize that our data must remain private even if it’s not sensitive. These kinds of intrusions, whether from provider employees or from breaches, should not be tolerated, and we have to hold these companies accountable. Many are willing to sacrifice the privacy of their purchases and words for free space online; however, no one should tolerate a lax infrastructure that fails to keep our data safe.
Device Ownership Isn’t Tantamount to Sharing, Except When It Is
Forget for a moment that free cloud providers’ employees can possibly read your private emails, listen to recorded voicemails and commit other privacy-violating acts. Also forget that hackers are hitting these free services 24/7, trying (and many times succeeding) to expose your information to the world. Let’s take a look at the companies that secretly track you even when the correct settings are turned off!
Apple may be the hip brand of choice for everyone, but they, like many other large tech corporations, have a bit of a privacy PR issue. Recently it was disclosed that Apple iPhones send the user’s call history to Apple even when the normal iCloud backup service is turned off. And it’s not just the Phone app’s history being sent; Skype, WhatsApp, Viber and others are included as well. What is Apple doing with your call history, you ask? Feel free to wildly speculate! We’re not completely sure, but it probably involves marketing analytics, as well as competitive intelligence on call recipients who don’t have Apple products.
Don’t think Google is off the hook here either. Their mobile operating system, Android, is a free, open-source platform that nearly every cell phone maker (except Apple, of course) uses to run its mobile phones. This creates an interesting dilemma, though.
You can get a $50 mobile phone with nice features, including one that periodically sends your text messages to a Chinese company free of charge. About 700 million Android-powered phones and other mobile devices shipped with this “feature” and, again, we’re not completely sure what is being done with the data. It’s a major violation of users’ privacy.
We’re not done with Google, unfortunately. Google can obviously claim that anyone can use their Android OS and pervert its use for their own nefarious means, but let’s look at a Google-developed and published platform that is not only supplied free to the public, but Google also has their own release of it: Chromium. If this doesn’t sound familiar, it’s the development platform that Google’s Chrome web browser is built on.
In 2015, it came to light that lines of code had been inserted into Chromium, and therefore Chrome, that allowed Google to essentially turn on any Chrome user’s microphone, record audio and then transmit it back to Google. Care to wildly speculate on this one as well?
In this particular instance, the behavior was originally reported as a “bug” and declared fixed by Google. However, the module responsible, Hotword, still exists and runs on millions, if not billions, of browsers. For those of you currently turning off your computers and throwing them out, here is how to disable it.
Thought I forgot about Microsoft, did you? It’s come to light that Microsoft, for the first time in its history, is going to share telemetry data from users’ Windows 10 computers, laptops and phones with a third party. This data can include the list of apps running on the device, log files from system crashes and other statistics.
They’re sharing this with FireEye, a major player in cybersecurity threat intelligence, in hopes of enhancing threat detection in Windows by adding a new machine-learning feature. Assuming you pay for it, of course; it’s basically a new kind of integrated virus scanner.
While I’m all for anything that improves our intelligence on current threats, this move also disturbs privacy advocates because it reverses a long-standing Microsoft policy and may open the door to more third-party sharing in the future. There is also no opt-out for privacy-minded users at the moment, and that option may never materialize.
Growing Up in the Age of Questionable Privacy
My preschool-age daughter has grown up in a much different world than I did. When she was an infant, she recognized three central figures in her life: Mommy, Daddy and Camera (usually held by Mommy or Daddy). She has grown up with a camera in her room (encrypted and protected; I am a cybersecurity nerd, after all), so it’s not unusual for her.
I often wonder how this will impact her. She thinks nothing of cameras around her and one day may grow up to be like the youth of today, who have a completely different concept of what privacy is.
Information many people would consider private or not really appropriate to share is constantly being put into public spaces like social media. Thanks to Facebook I have learned about others’ devastating medical conditions and deep-rooted family drama. I didn’t ask for this information but it’s delivered to me nevertheless.
While I respect a person’s right to use social media as they see fit, I can’t help but see the serious issues with these choices from a cybersecurity standpoint. People don’t realize how damaging information can be when used against them, and while many think they’re posting only to a select group of friends, their sharing settings may be configured otherwise.
What we have is an evolving language revolving around privacy. Some information, like medical information, is no longer deemed sensitive enough for some people to censor.
At a more basic level, an individual’s geographical movement is no longer private. I don’t even have to cite a reference here: if you have a smartphone, or a car with a service like OnStar, then your movement is tracked 24/7 by the corporations that supply your hardware, software and service.
At the government level, two rather alarming movements are now under way in this evolution. Recently, a group of federal prosecutors in California claimed that it is constitutional to enter a person’s home without a warrant and demand fingerprint-unlock access to the devices of everyone in the household.
This is alarming because it appears to circumvent the Fourth Amendment, which protects citizens against unreasonable search and seizure. I’m obviously no attorney, but this stance has been distressing for many.
This anti-privacy position comes on the heels of a recent FBI investigation where government hackers used a single warrant to hack into 8,000 people’s computers in about 120 different countries.
The FBI was breaking up a Dark Web child pornography ring, and while I am 100% in favor of ending horrific things like this, the carte blanche use of a single warrant, where several thousand warrants would typically need to be issued internationally, has many privacy advocates up in arms. If the FBI is willing to stretch the warrant system this far for one case, it could easily do so for others.
Not to be outdone, the President-Elect has recently indicated that he is against Net Neutrality, claiming it was a “power grab” by the FCC to increase needless regulation. While I’m not really a regulation advocate, we do need to balance the general public’s access speed against corporate interests.
This is relevant to the privacy issue in this article because those smaller companies that offer “privacy first” products may end up paying higher costs for internet access thus hindering their ability to grow their customer base. We all need better privacy and while it may cost us a bit I would hate to see this needed service skyrocket in price because the backbone providers are able to jack up the cost unfairly.
The Difficulty with Our Stars
Privacy is a tough nut to crack for many cybersecurity experts when dealing with high-profile clients. Sure, we can lock down accounts, enable two-factor authentication, counsel against using any public social media platforms unless it’s for public-only thoughts and advise them on how to remain virtually invisible to the outside world, but it becomes a battle thanks to the major corporations.
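Of the defenses just listed, two-factor authentication is the cheapest to adopt and the easiest to demonstrate. The codes your phone app shows are time-based one-time passwords (TOTP, RFC 6238), and a minimal sketch needs nothing beyond the standard library. The base32 secret below is made up purely for illustration; in practice, the secret comes from the QR code the service shows at enrollment.

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32, for_time=None, digits=6, period=30):
    """Generate an RFC 6238 time-based one-time password (SHA-1 variant)."""
    key = base64.b32decode(secret_b32, casefold=True)
    # The moving factor is the count of 30-second periods since the epoch.
    counter = int((for_time if for_time is not None else time.time()) // period)
    msg = struct.pack(">Q", counter)
    digest = hmac.new(key, msg, hashlib.sha1).digest()
    # Dynamic truncation (RFC 4226): low nibble of last byte picks the offset.
    offset = digest[-1] & 0x0F
    code = (struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF) % (10 ** digits)
    return str(code).zfill(digits)

# Example secret, invented for illustration only.
code = totp("JBSWY3DPEHPK3PXP")
```

The point of the exercise: even if a password leaks in one of the breaches discussed above, an attacker without the shared secret cannot produce the current six-digit code.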
If a celebrity is using an iPhone or Android device that secretly sends data back to Apple or Google, it’s a serious issue: we have no idea who could potentially use that data, or whether the free Gmail account they absolutely refuse to give up (an all-too-familiar debate) is being browsed by Google employees without consent.
Cybersecurity professionals can lock down the front end very effectively but if we cannot control the backend and infrastructure, which can be impossible depending on the client’s choices, then it’s our job to convey just how exposed they can be and hope they take it as seriously as we do.
In the next few years, we are going to see serious challenges to our privacy rights, which will set the precedent for how future generations interact with each other and what they choose to consider private.
What we should not let happen is an unwanted invasion of our rights by others. Ideally, we should be able to choose the level of privacy we truly wish to maintain, even if it costs a bit extra to enforce, and there should be penalties for those who violate what we personally deem private.
Want More Cybersecurity Insight?
Want to learn more from Nick and other cybersecurity insiders on the future of data privacy, security and protecting information? Sign up today to get our free guide to IT compliance!