Category: Legislation

India’s Intermediary Guidelines

I’ve signed on to Access Now’s letter to the Indian Ministry of Electronics and Information Technology, asking the Government of India to withdraw the draft amendments proposed to the Information Technology (Intermediary Guidelines) Rules.

As they say in their press release:

Today’s letter, signed by an international coalition of 31 organizations and individuals, explains how the proposed amendments threaten fundamental rights and the space for a free internet, while not addressing the problems that the Ministry aims to resolve. A key concern is the requirement for intermediaries to “enable tracing out of such originator” of content that an intermediary hosts, which could lead to demands that providers weaken the security features of their products and services. This threat to privacy would in turn endanger free expression.

Today, a global coalition led by civil society and technology experts sent a letter asking the government of Australia to abandon plans to introduce legislation that would undermine strong encryption. The letter calls on government officials to become proponents of digital security and work collaboratively to help law enforcement adapt to the digital era.

In July 2017, Prime Minister Malcolm Turnbull held a press conference to announce that the government was drafting legislation that would compel device manufacturers to assist law enforcement in accessing encrypted information. In May of this year, Minister for Law Enforcement and Cybersecurity Angus Taylor restated the government’s priority to introduce legislation and traveled to the United States to speak with companies based there.

Today’s letter signed by 76 organizations, companies, and individuals, asks leaders in the government “not to pursue legislation that would undermine tools, policies, and technologies critical to protecting individual rights, safeguarding the economy, and providing security both in Australia and around the world.” (Read the full announcement here)

I’m pleased to have joined in this effort by Access Now, and you can sign, too. Especially if you are Australian, I encourage you to do so.

Threat Model Thursday: Architectural Review and Threat Modeling

For Threat Model Thursday, I want to use current events here in Seattle as a prism through which we can look at technology architecture review. If you want to take this as an excuse to civilly discuss the political side of this, please feel free.

Seattle has a housing and homelessness crisis. The cost of a house has risen nearly 25% above the 2007 market peak, and has roughly doubled in the 6 years since April 2012. Fundamentally, demand has outstripped supply and continues to do so. As a city, we need more supply, and that means evaluating the value of things that constrain supply. This commentary from the local Libertarian party lists some of them.

The rules on what permits are needed to build a residence, what housing is acceptable, or how many unrelated people can live together (no more than eight) are expressions of values and priorities. We prefer that the developers of housing build nothing rather than build housing that doesn’t comply with the city’s Office of Planning and Community Development’s 32 pages of neighborhood design guidelines. We prefer to bring developers back after a building is built if the siding is not the agreed color. These are choices that express the values of the city. And because I’m not a housing policy expert, I may miss some of the nuances, but I can see the effect of the policies overall.

Let’s transition from the housing crisis here in Seattle to the architecture crisis that we face in technology.

No, actually, I’m not quite there. The city killed micro-apartments, only to replace them with … artisanal micro-houses. Note the variation in size and shape of the two houses in the foreground. Now, I know very little about construction, but I’m reasonably confident that if you read the previous piece on micro-housing, many of the concerns regulators were trying to address apply to “True Hope Village,” construction pictured above. I want you, dear reader, to read the questions about how we deliver housing in Seattle, and treat them as a mirror into how your organization delivers software. Really, please, go read “How Seattle Killed Micro-Housing” and the “Neighborhood Design Guidelines” carefully. Not because you plan to build a house, but as a mirror of your own security design guidelines.

They may be no prettier.

In some companies, security is valued, but has no authority to force decisions. In others, there are mandatory policies and review boards. We in security have fought for these mandatory policies because without them, products ignored security. Similarly, we have housing rules because of unsafe, unsanitary, or overcrowded housing, and to reduce the blight of slums.

Security has design review boards which want to talk about the color of the siding a developer installed on the now-live product. We have design regulation which kills apodments and tenement housing, and then glorifies tiny houses. From a distance, these rules make no sense. I didn’t find them sensible myself. I remember a meeting with the Microsoft Crypto Board. I went in with some very specific questions regarding parameters and algorithms: should we use this hash algorithm or that one? It took less than five minutes for the meeting to go off the rails with suggestions about non-cryptographic architecture. I remember shipping the SDL Threat Modeling Tool, going through the roughly five policy tracking tools we had at the time, and discovering at the very last minute that we had extra rules that were not documented in the documents I found at the start. It drives a product manager nuts!

Worse, rules expand. From the executive suite, if a group isn’t growing, maybe it can shrink? From a security perspective, the rapidly changing threat landscape justifies new rules. So there’s motivation to ship new guidelines that, in passing, spend a page explaining all the changes that are taking place. And then I see “Incorporate or acknowledge the best features of existing early to mid-century buildings in new development.” What does that mean? What are the best features of those buildings? How do I acknowledge them? I just want to ship my peer-to-peer blockchain features! Nothing in the design review guidelines is clearly objectionable. But taken as a whole, they create a complex and unpredictable, and thus expensive, path to delivery.

We express values explicitly and implicitly. In Seattle, implicit expression of values has hobbled the market’s ability to address a basic human need. One of the reasons that embedding is effective is that the embedded gatekeepers can advise and interpret rules in relation to real questions. Embedding expresses the value of collaboration, of dialogue over review. Does your security team express that security is more important than product delivery? Perhaps it is. When Microsoft stood down product shipping for security pushes, it was an explicit statement. Making your values explicit and debating prioritization is important.

What side effects do your security rules have? What rule is most expensive to comply with? What initiatives have you killed, accidentally or intentionally?

New Cyber Security Bill: Crowdsource Analysis?

A lot of people I trust are suggesting that the “Collins-Lieberman” bill has a substantial chance of passing. I have some really interesting (and time-consuming) work tasks right now, and so I’m even more curious than usual what you all think, especially about this one.

According to the press release, the “Collins-Lieberman” bill would require:

  • The Department of Homeland Security (DHS) to assess the risks and vulnerabilities of critical infrastructure systems—whose disruption from a cyber attack would cause mass death, evacuation, or major damage to the economy, national security, or daily life—to determine which should be required to meet a set of risk-based security standards. Owners/operators who think their systems were wrongly designated would have the right to appeal.
  • DHS to work with the owners/operators of designated critical infrastructure to develop risk-based performance requirements, looking first to current standards or industry practices. If a sector is sufficiently secured, no new performance requirements would be developed or required to be met.
  • The owners of a covered system to determine how best to meet the performance requirements and then verify that it was meeting them. A third-party assessor could also be used to verify compliance, or an owner could choose to self-certify compliance.
  • Current industry regulators to continue to oversee their industry sectors.
  • Information-sharing between and among the private sector and the federal government to share threats, incidents, best practices, and fixes, while maintaining civil liberties and privacy.
  • DHS to consolidate its cybersecurity programs into a unified office called the National Center for Cybersecurity and Communications.
  • The government to improve the security of federal civilian cyber networks through reform of the Federal Information Security Management Act.

Some of that, like risk-based security standards, sounds potentially tremendously positive. There are some clear risks, too: for example, that DHS will make a best-practices table of risk management activity without any focus on outcomes, and then classify it.

Other bits, like information sharing, sound worrisome, because the authors clearly know that there’s a risk of privacy and liberty impacts. It’s not clear what the data to be shared is. If that’s (for example) “Verisign has been pwned using a 3-year-old Flash exploit,” there’s minimal impact to liberty. (Of course, since they haven’t said anything, we don’t know how Verisign was owned.) If it’s “We suspect Kevin Mitnick,” then that’s both less useful and more privacy-impactful.

Stepping back, where should I look for analysis? Have you looked at the bill? What does it do for the New School pillars? As a reminder, those are:

  • Learning from other professions, such as economics and psychology, to unlock the problems that stymie the information security field. The way forward cannot be found solely in mathematics or technology.
  • Sharing objective data and analysis widely. A fetish for secrecy has held us back.
  • The embrace of the scientific method for solving important problems. Analyzing real world outcomes is the best way for information security to become a mature discipline.

In other words, how New School is this bill?

'Experts' misfire in trying to shoot down Charney's 'Internet Security Tax' idea

Industry ‘experts’ misfired when they criticized Microsoft’s Scott Charney’s “Internet Security Tax” idea. Q: How many of these ‘experts’ know anything about information economics and public policy responses to negative externalities? A: Zero. Thus, they aren’t really qualified to comment. This is just one small case in the ongoing public policy discussions regarding the economics of information security, but given the reaction of the ‘experts’, this was a step backward.

Continue reading

Green Dam

Update 26 June 2009: The status of Green Dam’s optionality is still up in the air.  See, for example, this news story on PC makers’ efforts to comply, which points out that

Under the order, which was given to manufacturers in May and publicly released in early June, producers are required to pre-install Green Dam or supply it on disc with every PC sold in China from July 1.

Last week, it appeared the government backed away from requiring compulsory installation by users, but manufacturers are still being required to provide the software.

I suspect that there will be at least one more update to this post before all is said and done.

Update 17 June 2009: Green Dam is now to be optional, but installed-by-default.

There’s a great deal of discussion in China right now about the new government-mandated “Green Dam” Internet filtering software that must be installed on all PCs in the People’s Republic of China.

Every PC in China could be at risk of being taken over by malicious hackers because of flaws in compulsory government software.

The potential faults were brought to light by Chinese computer experts who said the flaw could lead to a “large-scale disaster”.

The Chinese government has mandated that all computers in the country must have the screening software installed.

It is intended to filter out offensive material from the net.

I was in a taxi in Beijing a couple days ago and the driver was listening to a call-in/talk radio show whose topic was the software and its flaws/weaknesses.  My post, however, had to wait until I returned stateside, due to this blog being blocked on all three of the connections I tried while I was in China.

The consensus about this software among the locals that I spoke to is that it will be widely ignored, except in places like primary schools and some government offices.

There is so much to say about this, however, that I almost don’t know where to begin.  First, there is the issue of externalities.  The benefit from this software goes to the government censors.  The cost, however, will be borne by those whose machines are rendered less stable, less secure, and less useful (due to the censoring).  This is the opposite of the theoretical goal of regulation: to transfer externalities back onto their creators, not the other way around.

The results here may be even more toxic than observers currently realize, however.  By demanding compliance even when it does direct harm to those who must comply, the government undermines the loyalty of the citizenry and its own credibility.  It may only be one straw on the camel of Chinese citizens’ discontent, but eventually, there will be a straw that breaks the camel’s back.  This software has re-energized the domestic debate over the role of government censorship and whether their goal is to keep the populace safe or merely in-line.

Similarly, there is a lesson here for security and risk managers.  Namely, policies must also be perceived as benefiting those they govern.  Corporations whose policies are too obviously unfair or which demonstrate a contempt for employees produce similar disloyalty.  While it may not be immediately obvious in the current job market–people generally won’t quit in protest if they can’t find another job–that makes the effect worse.  A grumbling workforce is an unproductive workforce.

Yes, we must achieve our goals, in my case protecting information, and the combination of reduced budgets and nervous employees makes it that much harder to achieve results.  But in times like these, we also need to tread more lightly than ever since the resisters of policy–those employees who are more likely to be a risk–are more likely to stay with us and undermine it from within.

So, as ever, when we are dealing with security, the mantra remains, “People, process and technology–in that order.” Any attempt to attack the problem otherwise frequently produces unintended, and often counterproductive, consequences.


I don’t know what my employer’s corporate stance is going to be, but we have a significant white collar presence in China, so will probably be unable to ignore the problem.

When asked, I will argue that we already perform this filtering on our corporate proxy servers, but it does not change the fact that the government has created a huge externality for their population and for companies operating here as part of a futile attempt to prevent Chinese citizens from viewing porn or dissident political commentary–not necessarily in that order, IMHO.

Statistics Police?!

From Gelman’s blog:

U.K. Sheriff Cites Officials for Serious Statistical Violations

I don’t know if we need an “office” of information assurance in the government sector, but it would be nice to have some penalty on the books for folks who abuse basic common sense statistical principles. Of course, the *real* answer lies in education and disclosure, but history would suggest that the Enlightenment Bus is usually late and tends to make many unscheduled detours.

Oh, and just so you can share in my mental pain, here’s the song that will probably be stuck in my head for the rest of the day now:

The Eyes of Texas Are on Baseboard Management Controllers? WHAT??!!!


UPDATE: It looks like the “vendor language” around Section Six has been struck!

Given Bejtlich’s recent promises, I thought we’d take a quick but pragmatic look at why risk assessments, even dumb, back-of-the-envelope assessments, might just be a beneficial thing.

As you probably know, the guys here at NewSchool and the guys at sister site EmergentChaos are very interested in the government regulation of cyberspace.  Oh, we also happen to be pretty good with the information risk stuff, too.  So I’m sure you wouldn’t be surprised that we spent some time looking over what one of the biggest, most influential states in the Union, Texas (Austin is also one of my most favorite places thanks to my friend Joe Visconti), is doing about legislating information security.  Currently they have a bill in consideration, HB1830S.  Highlights here:

HB1830S has some pretty good stuff in it.  The kind of legislation that tends to make sense, even if you are a “government hands off” kind of guy like I am.

Section 2 is about background checks and having policies and so forth.  This is wonderful, it addresses about the only control we have against Internal threat agents with significant privileges.

Section 3 seems to exempt information security information (like specific vulnerabilities) from the public record.  I’m all for some level of disclosure here (something like the letter grades the federal government releases is fine), but really, the citizens of the state don’t need particulars.

Section 4 talks about what InfoSec information should be confidential and talks about vendor relationships.  After working on some state RFPs (not Texas) and watching how a specific requirement for a “Penetration Test” was awarded to someone who, in their RFP response, specifically said that they were only going to perform a “Vulnerability Assessment”, I appreciate these sorts of clauses.

Section 5 covers internal state reporting concerns for vuln data, great.

SECTION SIX WHAT THE !@#%^!@@#$* IS THIS???!!!

“Government Code, to require that the biennial operating plan describe the state agency’s current and proposed projects for the biennium, including how the projects will address certain matters, including using, to the fullest extent, technology owned or adapted by other state agencies, including closed loop event management technology that secures, logs, and provides audit management of baseboard management controllers and consoles of cyber assets.”

Let’s parse that and read it again:

“Government Code, to require that the biennial operating plan describe the state agency’s current and proposed projects for the biennium, including how the projects will address certain matters,…”

Looking good, it’s always nice to have a plan.

“…including using, to the fullest extent, technology owned or adapted by other state agencies,…”

Great! I’m all for sharing information among security professionals, that’s pretty much one of the fundamental pillars of the New School.

“…including closed loop event management technology that secures, logs, and provides audit management of baseboard management controllers and consoles of cyber assets.”

Wait, what?

Ok, I’ve heard of closed loop processing in Business Intelligence (A system is said to perform closed-loop processing if the system feeds information back into itself).  I’ve heard the phrase Closed-Loop in SOA.  But I’m sorry, the use of “closed loop event management technology that secures, logs, and provides audit management of baseboard management controllers” sounds like somebody lifted it from a vendor brochure.
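In case the jargon helps anyone: “closed loop” just means feedback. Here’s a minimal sketch in Python of what closed-loop processing looks like in general; the function names and the thermostat-style example are mine, purely illustrative, and have nothing to do with whatever product the bill’s language was lifted from:

```python
def closed_loop(process, state, steps):
    """Run `process` repeatedly, feeding each output back in as the next input.

    That feedback path is what makes the loop "closed": the system consumes
    its own output rather than only external input.
    """
    history = [state]
    for _ in range(steps):
        state = process(state)   # output becomes the next iteration's input
        history.append(state)
    return history


def nudge_toward_setpoint(reading, setpoint=70.0, gain=0.5):
    """Thermostat-style controller: move the reading halfway to the setpoint."""
    return reading + gain * (setpoint - reading)


# Starting at 60.0, each pass closes half the remaining gap to 70.0.
trace = closed_loop(nudge_toward_setpoint, 60.0, 4)
```

Nothing in that sketch secures, logs, or audit-manages anything, of course; the point is only that “closed loop” is a generic control-systems idea, not a property unique to one vendor’s event management product.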

Also, I know that this blog generally attracts some of the best and most forward thinking InfoSec readers/professionals – even if you disagree with us.  But if you need to go look up what a baseboard management controller  (BMC) is and does, to remind yourself, go right ahead.  I had to.

Now read the rest of HB1830S highlights there and put Section Six in context.

Is it just me, or does this seem like someone in Texas is trying to legislate the use of a specific vendor’s rather esoteric and specific security control?  I mean, even if BMC security is really important in, say, SCADA systems – is there a reason that the dozens (?) of other agencies would have to waste their money on this?

And why legislate this specific technology?  Shouldn’t the agency security management be able to do their own risk assessments and prioritize based on the significant threats that, you know, they’re ACTUALLY SEEING?  And I’m not asking for Forests of Bayesian Belief Networks to establish risk and vulnerability information via Monte Carlo simulations here, I’m asking for a basic risk-based sanity check to make decisions, decisions based in reality, not fear.  I mean, I ran a quick poll of security pros on Twitter about BMCs, and so far nobody has claimed to have ever seen one piece of exploit code, much less heard of an actual *incident*.  Now I’m sure that the State of Texas does a great job with Information Security and all, but I’m willing to bet good money that the BMCs of their systems are the least of their security problems.

Bottom line: legislating disclosure, policy, and even ensuring critical processes are in place is a useful endeavor, and the rest of HB1830S does a good job.  But legislating a specific technology is bad for a couple of reasons:

1.)  It removes management’s ability to expend resources on the actual problems they have. You are legislating without the context of risk, even poorly derived risk statements.

2.)  If it takes an act of the legislature to force adoption, it will take a similar or more difficult act of politics to remove that technology when it’s outlived its usefulness (and one wonders if BMC-securing technology would EVER be useful except in fringe cases).

Things Are Tough, Don’t Waste Taxpayer Money, Please!

HB1830S could be a good piece of legislation.  Strike the BMC aspect of Section Six and it becomes more than reasonable.  Heck, add “to the fullest extent POSSIBLE” or “to the extent that’s REASONABLE” and ask state CISOs to provide threat event metrics for the BMC if you want.  But please, Texas, whatever this vendor is paying you in lobbying perks – it’s not worth the waste and hassle and the risk of derision from the parts of the information security community that actually happen to be concerned with public safety.