‘No need’ to tell the public(?!?)

When Andrew and I wrote The New School, and talked about the need to learn from other professions, we didn’t mean for doctors to learn from ‘cybersecurity thought leaders’ about hiding their problems:

…Only one organism grew back. C. auris.

It was spreading, but word of it was not. The hospital, a specialty lung and heart center that draws wealthy patients from the Middle East and around Europe, alerted the British government and told infected patients, but made no public announcement.

“There was no need to put out a news release during the outbreak,” said Oliver Wilkinson, a spokesman for the hospital.

This hushed panic is playing out in hospitals around the world. Individual institutions and national, state and local governments have been reluctant to publicize outbreaks of resistant infections, arguing there is no point in scaring patients — or prospective ones…

Dr. Silke Schelenz, Royal Brompton’s infectious disease specialist, found the lack of urgency from the government and hospital in the early stages of the outbreak “very, very frustrating.”

“They obviously didn’t want to lose reputation,” Dr. Schelenz said. “It hadn’t impacted our surgical outcomes.” (“A Mysterious Infection, Spanning the Globe in a Climate of Secrecy”, NYTimes, April 6, 2019)

This is the wrong way to think about the problem. Mr. Wilkinson (as quoted) is wrong. There is a fiduciary duty to tell patients that they are at increased risk of C. auris if they go to his hospital.

Moreover, there is a need to tell the public about these problems. Our choices, as a society, kill people. We kill people when we allow antibiotics to be used to make fatter cows or when we allow antifungals to be used on crops.

We can adjust those choices, but only if we know the consequences we are accepting. Hiding outcomes hinders cybersecurity, and it’s a bad model for medicine or public policy.

(Picture courtesy of Clinical Advisor. I am somewhat sorry for my use of such a picture here, where it’s unexpected.)


Congratulations to the Hayabusa2 mission team, who flew to an asteroid, dropped multiple rovers, an impactor, and a separate camera satellite to observe the impactor. Hayabusa2 then flew around to the far side of the asteroid to avoid ejecta from the impactor. In a few weeks, Hayabusa2 will probably land, collect more samples, and then fly back to Earth.

For more: the Hayabusa2 page at the Japan Aerospace Exploration Agency, and don’t miss the “major onboard instruments” page, including an ion engine using 1/10th the power of a chemical propellant, fixes to malfunctions that happened after 15,000 hours of operation, a better seal on the collection horn, and more.
A Japanese spacecraft may have just blown a crater in a distant asteroid (Science Mag)

Books Worth Your Time (Q1 2019)


  • Making Software: “What Really Works, and Why We Believe It” by Andy Oram and Greg Wilson. This collection of essays is a fascinating view into the state of the art in empirical analysis of software engineering.
  • Agile Application Security by Laura Bell, Michael Brunton-Spall, Rich Smith and Jim Bird. A really good overview of the many moving pieces in an agile SDL. Good enough that I bought a paper copy to augment the ebook. (Also, sometimes redundant, and says nice things about my work.)
  • Click Here to Kill Everybody by Bruce Schneier. Thought-provoking survey of the problems that come from the book above not being better read. More seriously, we haven’t scaled application security, and even if we do, there will be bad developers who’ll do a crappy job at building things. What can we do about that as a society? I don’t like all of Schneier’s answers, but the reasoning is sound.


  • Trust Me, I’m Lying: Confessions of a Media Manipulator by Ryan Holiday lays out the toolbox of the fellow who used to run marketing for American Apparel. Shows how guerrilla marketing works in the age of Twitter, and outlines techniques now being used to screw up elections and people’s lives.
  • The Internet of Garbage by Sarah Jeong. As a summary of the problems and challenges of the internet, it’s aged sadly well since 2015.
  • The Tangled Tree: A Radical New History of Life. We’re used to thinking that genes are passed on from parents, but as David Quammen explains, there’s also horizontal gene transfer (NIH, Wikipedia). Really fascinating history of both the science and the personalities involved. Recommended despite the writing being somewhat rocky and uneven – these are hard topics, and I do not envy the author’s task of making them an accessible and interesting read.
  • Things We Think About Games by Will Hindmarch and Jeff Tidball is 140 micro-essays about games. Some I loved, some I hated, but I enjoyed the heck out of it.


As it turns out, all three fiction books are re-imaginings of other stories. If you find that wicked annoying, these are not for you.

  • The Queens of Innis Lear, by Tessa Gratton, is a re-telling of Lear from the perspective of his daughters.
  • A Study in Honor, by Claire O’Dell, re-tells the Holmes/Watson story in the aftermath of a second American Civil War.
  • Spinning Silver, by Naomi Novik, is a re-telling of the Rumplestiltskin tale. (Hugo nominated)

That’s what I read last quarter that I want to share. What was memorable for you?

Leave Those Numbers for April 1st

“90% of attacks start with phishing!*” “Cyber attacks will cost the world $6 trillion by 2020!”

We’ve all seen these sorts of numbers from vendors, and in a sense they’re April Fools Day numbers: you’d have to be a fool to believe them. But vendors quote insane numbers because there’s no downside and much upside. We need to create more and worse downside, and the road there lies through losing sales.

We need to call vendors on these numbers, and say “I’m sorry, but if you’d lie to me about that, why should I believe the numbers you’re claiming that are hard to verify? The door is to your left.”

If we want to change the behavior, we have to change the impact of the behavior. We need to tell vendors that there’s no place for made-up numbers, debunked numbers, or unsupported numbers in our buying processes. If those numbers are in their sales and marketing material, they’re going to lose business for it.

* This one seems to trace back to analysis that 90% of APT attacks in the Verizon DBIR started with phishing, but APT and non-APT attacks are clearly different.

20 Years of STRIDE: Looking Back, Looking Forward

“Today, let me contrast two 20-year-old papers on threat modeling. My first paper on this topic, “Breaking Up Is Hard to Do,” written with Bruce Schneier, analyzed smart-card security. We talked about categories of threats, threat actors, assets — all the usual stuff for a paper of that era. We took the stance that “we experts have thought hard about these problems, and would like to share our results.”

Around the same time, on April 1, 1999, Loren Kohnfelder and Praerit Garg published a paper in Microsoft’s internal “Interface” journal called “The Threats to our Products.” It was revolutionary, despite not being publicly available for over a decade. What made the Kohnfelder and Garg paper revolutionary is that it was the first to structure the process of how to find threats. It organized attacks into a model (STRIDE), and that model was intended to help people find problems, as noted…”
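To make the structure concrete: STRIDE names six threat categories — Spoofing, Tampering, Repudiation, Information disclosure, Denial of service, Elevation of privilege. Here's a minimal sketch of how the mnemonic prompts threat-finding; the "login service" component and the prompting questions are my illustrative assumptions, not examples from the Kohnfelder and Garg paper:

```python
# STRIDE as a checklist: walk each category against a component.
# The component name and questions below are illustrative, not canonical.
STRIDE = {
    "Spoofing": "Can an attacker pretend to be another user or the service?",
    "Tampering": "Can requests or stored data be modified?",
    "Repudiation": "Can a user plausibly deny an action they took?",
    "Information disclosure": "Can credentials or session data leak?",
    "Denial of service": "Can the component be made unavailable?",
    "Elevation of privilege": "Can a normal user gain admin rights?",
}

def enumerate_threats(component: str) -> list[str]:
    """Return one prompting question per STRIDE category for a component."""
    return [f"{component} / {category}: {question}"
            for category, question in STRIDE.items()]

for line in enumerate_threats("login service"):
    print(line)
```

The point of the model is exactly this: instead of waiting for experts to share results, anyone can iterate the categories against each component and ask whether each threat applies.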

Read the full version of “20 Years of STRIDE: Looking Back, Looking Forward” on Dark Reading.

India’s Intermediary Guidelines

I’ve signed on to Access Now’s letter to the Indian Ministry of Electronics and Information Technology, asking the Government of India to withdraw the draft amendments proposed to the Information Technology (Intermediary Guidelines) Rules.

As they say in their press release:

Today’s letter, signed by an international coalition of 31 organizations and individuals, explains how the proposed amendments threaten fundamental rights and the space for a free internet, while not addressing the problems that the Ministry aims to resolve. A key concern is the requirement for intermediaries to “enable tracing out of such originator” of content that an intermediary hosts, which could lead to demands that providers weaken the security features of their products and services. This threat to privacy would in turn endanger free expression.