Syrian Electronic Army takes over MarkMonitor; dog catches car

Facebook hijack attempt

On Wednesday, the well-known hacktivist group Syrian Electronic Army managed to get into MarkMonitor’s management portal. Once there, the group targeted Facebook: they used the management portal to alter Facebook’s WHOIS record, including Facebook’s registrant contacts, intending to hijack Facebook’s domain nameservers (as they had done earlier with both PayPal and eBay’s UK sites).

However, the hijack took “too much time,” and while they were able to take some screenshots of their process, they never got much farther than a prank. Facebook’s service was never disrupted, and shortly afterward, the SEA were kicked out of MarkMonitor’s management portal.

Here’s what should get people’s attention: these guys (maybe girls, too; let’s use “guys” as a unisex term) were in MarkMonitor’s management portal. For the rest of the evening, there was very little news about the incident, other than some limited reporting on the aborted attempt at hijacking Facebook’s domain. Facebook gets people’s attention.

So who is MarkMonitor? MarkMonitor does brand protection for pretty much everybody you’ve ever heard of. Sure, you trademark attorneys may just know them as that company that looks after trademarks on the internet, but they do more than that. They protect against things like fraud, spam, phishing, malware, and piracy. They monitor internet traffic to look for these sorts of things and collect data on the things they find. And, more importantly, through this activity, they hold the keys to how every brand that uses them is perceived — how every brand is trusted — by the rest of the world.

The MarkMonitor Portal is currently unavailable...

So MarkMonitor got popped, and they got popped deeply enough to give some random attackers full access to their management portal. Which means the attackers could have done anything they wanted to. They could have looked at, copied, or changed any data they wanted to. What they chose to do was make a lot of noise trying to hijack Facebook in a really obvious way, and MarkMonitor discovered the breach and kicked the attackers out.

So think about this for a minute: what if they’d been quieter? What if their attack had been much more subtle? And…how do we know that while they were annoying Facebook, they weren’t also doing something else?

What Does the CalOPPA Update Mean To You?

Edit, September 27, 2013: Governor Brown has signed AB370 into law. It is now the law in California.

Original post from Sept. 5 follows:

Last week, California’s State Senate unanimously passed AB370, which, if passed by the Assembly, will add new requirements to the California Online Privacy Protection Act. The California Online Privacy Protection Act (CalOPPA) is the law that requires anyone doing business online in California (therefore pretty much every business online) to post a privacy policy conspicuously on their website, and sets out the requirements of what that privacy policy needs to say.

As it currently stands, CalOPPA has only a few basic requirements. A privacy policy has to:

  • Be posted conspicuously on the website, or be easily accessible from a conspicuous link.
  • Identify the categories of personal information that the service collects.
  • Identify the categories of third parties with whom the service may share the collected information.
  • Explain how users can access and modify the collected information, if the service allows it.
  • Explain how the service notifies users of material changes to the privacy policy.
  • Post the effective date of the privacy policy.

Each requirement may be covered in detail or in just a few words, but every one of them has to appear in the policy.

New Additions: Do Not Track and Third Party Tracking

If it becomes law, AB370 will add two new requirements to CalOPPA. Both relate to the issue of third party tracking, which I go into in more depth here.

First, the bill requires online service operators to disclose whether or not they honor users’ “Do Not Track” requests, as well as any other mechanisms that let users opt out of certain kinds of personal information collection. The requirement is technology-neutral: even if a website technically honors “Do Not Track” requests but performs the same tracking through some other mechanism, it must disclose that in the privacy policy. And if technology evolves so that “Do Not Track” becomes obsolete and other tracking methods arise, online services must still disclose whether or not they honor users’ requests not to be tracked.
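On the technical side, a “Do Not Track” request is nothing exotic: it is just an HTTP request header, `DNT: 1`, that the browser attaches to every request. A minimal sketch of a server-side check might look like the following (the function names and the headers-as-dict shape are my illustration, not from any particular framework; only the header name and value come from the W3C’s Tracking Preference Expression draft):

```python
def honors_dnt(headers):
    """Return True when the request carries a Do Not Track signal (DNT: 1)."""
    return headers.get("DNT") == "1"

def may_set_tracking_cookie(headers, site_honors_dnt=True):
    # A site that promises in its privacy policy to honor DNT must skip
    # tracking when the signal is present. A site that chooses not to
    # honor it may still track -- but under AB370 it must say so.
    return not (site_honors_dnt and honors_dnt(headers))
```

The point of AB370 is the disclosure, not the mechanism: whichever branch your site takes, the privacy policy has to say which one it is.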

Second, the bill requires online service operators to disclose whether third parties may be tracking the user’s activities on the service.  If a third party, such as an advertising service, is tracking the user over time and across different sites, collecting personal information, those sites must let the user know.

What Does This Mean To You?

Nothing, yet. The California State Senate passed the bill unanimously, but it still needs to go to the California Assembly. However, this one has steam, and it’s coming.

If you’re a website operator, a mobile app developer, or a small business owner with an online presence, these are important changes. You need to check your privacy policy and make sure it addresses both third party tracking and whether or not you respect your users’ requests to opt out. If your site does not permit third party tracking, that’s fine: mention it in the policy. If your site does not respect your users’ Do Not Track requests, that’s fine: put it in the policy.

If you’re a user, this means two things. First, privacy policies may get a little longer. They’ve been getting longer over the years anyway as more legal requirements have necessitated adding sections. Second, privacy policies are about to get a little more interesting: they’ll let you know who’s watching you and whether or not you can opt out. And that’s never a bad thing.


Big data, privacy, and the FTC

Recently, FTC Chairwoman Edith Ramirez announced that the FTC will be aggressively policing companies with control over large databases of information, and will crack down on companies that don’t practice what they preach with consumer data use and data security. This announcement comes right on the heels of FTC member Julie Brill’s op-ed in the Washington Post about the agency’s new standards of transparency for data brokers.

This should not surprise anyone, or cause anyone to panic. Big data, and the large data brokers who hold it, is a major privacy battleground, and if you’re not aware of that right now, you’re not paying attention. Companies that control, and can aggregate and analyze, vast quantities of different kinds of data are using their abilities to innovate, and we are only beginning to see how big data can be used for exciting predictive analytics (analyzing trends in current data in order to predict future trends). However, these same predictive analytic abilities also pose a privacy risk.

For example, since 2008, Google has been monitoring its search data to help fight the flu. Using pattern analysis on aggregated search queries, Google has started correlating search trends to flu outbreaks. As it has amassed more and more data, Google began doing predictive analytics on these search trends. It is now able to predict a flu outbreak based on these search patterns more quickly than the CDC can (because in the real world, people often put off going to the doctor when they have the sniffles, but they’ll do a Google search for home remedies). Because Google Flu Trends can predict large outbreaks, the CDC can get flu vaccine to affected cities in time.
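The statistical core of this kind of system is just correlation: does the volume of flu-related searches move together with confirmed flu cases? Here is a toy sketch with made-up weekly numbers (Google’s actual models are far more sophisticated; the data and function here are purely illustrative):

```python
import math

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical weekly counts: flu-remedy searches vs. confirmed flu cases.
searches = [120, 150, 170, 210, 260, 300]
cases = [10, 14, 15, 22, 28, 33]

r = pearson(searches, cases)  # close to 1.0: search volume tracks outbreaks
```

Once a correlation like that has been established on historical data, the search volume becomes a leading indicator, because people search for home remedies days before they see a doctor.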

That’s the positive side of big data. One of the negative sides is the way a store may aggregate and monitor its customers’ purchases so closely that it can predict changes in an individual’s health. For instance, Target assigns each of its customers a “pregnancy prediction” score based on their other purchases, such as lotion or vitamins, and begins to send coupons and advertisements accordingly. A customer became very angry upon receiving coupons from Target for diapers and cribs addressed to his teenage daughter, who had not yet disclosed to him that she was pregnant, and had certainly not intentionally disclosed it to Target or authorized Target to disclose her health information.

The FTC has to strike a delicate balance here between the beneficial promises of big data and the very real privacy issues that predictive analytics raises. In taking a position of aggressive policing, Chairwoman Ramirez has said that she wants to enable the FTC to “get out of the way of innovation while making sure that consumer privacy is respected.”

So what does this mean to you?

If you’re a consumer: not much. There are a lot of companies, and more each day, gathering your information, aggregating your data, and performing predictive analytics. All this means is that the FTC will be doing more stringent enforcement of these companies’ privacy practices, and that’s a good thing for consumers.

If you’re a big data company, or trying to become one, this is more interesting to you. As they say, forewarned is forearmed: you know that the FTC will be policing big data aggressively. Take your fate into your own hands.

  • Start by building privacy controls in from the beginning. Privacy by design is one of the FTC’s major cornerstones to good organizational privacy. If you approach your business venture knowing that there will be privacy concerns and build in ways to mitigate and manage those privacy concerns from the start, you’ll be much better off as your business grows.
  • Communicate your privacy practices clearly to your users. You don’t have to hold back from doing the things that make your business work. You just need to tell your users what things you’re doing, and allow them to make informed decisions. When users can make informed decisions based on clearly presented facts, they are more likely to trust you with their information.
  • Establish good internal privacy and security policies so that your company can keep its promises. You may have the greatest privacy policy in the world, but if your employees aren’t reading it and don’t know what your promises are, then they won’t be able to follow it. Your application developers won’t be able to build privacy controls into new updates if they don’t know the rules, and your network and firewall administrators can’t set up security policies in line with what you’ve promised unless they know what you promised.
  • Maintain good data security practices, so that your users aren’t harmed when they give up their data to you. Inadequate data security is a problem for any company, but it can be catastrophic for big data brokers who have vast stores of consumer data. The FTC will be specifically looking for companies who are putting their customers at risk by failing to keep data secure, so set up good data encryption schemes, patch your equipment properly, and educate your internal users on good practices.
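To make that last point concrete: one small, uncontroversial piece of “good data security practices” is never storing user passwords in plaintext. A minimal sketch using Python’s standard library (the function names and the PBKDF2 parameters are my illustration of common practice, not anything the FTC prescribes):

```python
import hashlib
import hmac
import os

ITERATIONS = 200_000  # illustrative work factor; tune for your hardware

def hash_password(password, salt=None):
    """Derive a storable digest from a password; never store the plaintext."""
    # A fresh random salt per user defeats precomputed rainbow tables.
    salt = salt if salt is not None else os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
    return salt, digest

def verify_password(password, salt, digest):
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
    # compare_digest runs in constant time, avoiding timing side channels.
    return hmac.compare_digest(candidate, digest)
```

A breach that leaks salted, stretched digests is an incident; a breach that leaks plaintext passwords is the kind of catastrophe that draws exactly the FTC scrutiny described above.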