Falling in love with Rust

Let me preface this with an apology: this is a technology love story, and as such, it’s long, rambling, sentimental and personal. Also befitting a love story, it has a When Harry Met Sally feel to it, in that its origins are inauspicious…

First encounters

Over a decade ago, I worked on a technology to which a competitor paid the highest possible compliment: they tried to implement their own knockoff. Because this was done in the open (and because it is uniquely mesmerizing to watch one’s own work mimicked), I spent way too much time following their mailing list and tracking their progress (and yes, taking an especially shameful delight in their occasional feuds). On their team, there was one technologist who was clearly exceptionally capable — and I confess to being relieved when he chose to leave the team relatively early in the project’s life. This was all in 2005; for years afterward, Rust was, for me, “that thing that Graydon disappeared to go work on.” From the description as I read it at the time, Graydon’s new project seemed outrageously ambitious — and I assumed that little would ever come of it, though certainly not for lack of ability or effort…

Fast forward eight years to 2013 or so. Impressively, Graydon’s Rust was not only still alive, but it had gathered a community and was getting quite a bit of attention — enough to merit a serious look. There seemed to be some very intriguing ideas, but any budding interest that I might have had frankly withered when I learned that Rust had adopted the M:N threading model — including its more baroque consequences like segmented stacks. In my experience, every system that has adopted the M:N model has lived to regret it — and it was unfortunate to have a promising new system appear to be ignorant of the scarred shoulders that it could otherwise stand upon. For me, the implications were larger than this single decision: I was concerned that it may be indicative of a deeper malaise that would make Rust a poor fit for the infrastructure software that I like to write. So while impressed that Rust’s ambitious vision was coming to any sort of fruition at all, I decided that Rust wasn’t for me personally — and I didn’t think much more about it…

Some time later, a truly amazing thing happened: Rust ripped it out. Rust’s reasoning for removing segmented stacks is a concise but thorough damnation; their rationale for removing M:N is clear-eyed, thoughtful and reflective — but also unequivocal in its resolve. Suddenly, Rust became very interesting: all systems make mistakes, but few muster the courage to rectify them; on that basis alone, Rust became a project worthy of close attention.

So several years later, in 2015, it was with great interest that I learned that Adam started experimenting with Rust. On first read of Adam’s blog entry, I assumed he would end what appeared to be excruciating pain by deleting the Rust compiler from his computer (if not by moving to a commune in Vermont) — but Adam surprised me when he ended up being very positive about Rust, despite his rough experiences. In particular, Adam hailed the important new ideas like the ownership model — and explicitly hoped that his experience would serve as a warning to others to approach the language in a different way.

In the years since, Rust has continued to mature and my curiosity (and I daresay, that of many software engineers) has steadily intensified: the more I have discovered, the more intrigued I have become. This interest has coincided with my personal quest to find a programming language for the back half of my career: as I mentioned in my Node Summit 2017 talk on platform as a reflection of values, I have been searching for a language that reflects my personal engineering values around robustness and performance. These values reflect a deeper sense within me: that software can be permanent — that software’s unique duality as both information and machine affords a timeless perfection and utility that stand apart from other human endeavor. In this regard, I have believed (and continue to believe) that we are living in a Golden Age of software, one that will produce artifacts that will endure for generations. Of course, it can be hard to hold such heady thoughts when we seem to be up to our armpits in vendored flotsam, flooded by sloppy abstractions hastily implemented. Among current languages, only Rust seems to share this aspiration for permanence, with a perspective that is decidedly larger than itself.

Taking the plunge

So I have been actively looking for an opportunity to dive into Rust in earnest, and earlier this year, one presented itself: for a while, I have been working on a new mechanism for system visualization that I dubbed statemaps. The software for rendering statemaps needs to inhale a data stream, coalesce it down to a reasonable size, and render it as a dynamic image that can be manipulated by the user. This started off written in node.js, but performance became a problem (especially for larger data sets) and I did what we at Joyent have done in such situations: I rewrote the hot loop in C, and then dropped that into a node.js add-on (allowing the SVG-rendering code to remain in JavaScript). This was fine, but painful: the C was straightforward, but the glue code to bridge into node.js was every bit as capricious, tedious, and error-prone as it has always been. Given the performance constraint, the desire for the power of a higher level language, and the experimental nature of the software, statemaps made for an excellent candidate to reimplement in Rust; my intensifying curiosity could finally be sated!

As I set out, I had the advantage of having watched (if from afar) many others have their first encounters with Rust. And if those years of being a Rust looky-loo taught me anything, it’s that the early days can be like the first days of snowboarding or windsurfing: lots of painful falling down! So I took a deliberate approach with Rust: rather than do what one is wont to do when learning a new language and tinker a program into existence, I really sat down to learn Rust. This is frankly my bias anyway (I always look for the first principles of a creation, as explained by its creators), but with Rust, I went further: not only did I buy the canonical reference (The Rust Programming Language by Steve Klabnik, Carol Nichols and community contributors), I also bought an O’Reilly book with a bit more narrative (Programming Rust by Jim Blandy and Jason Orendorff). And with this latter book, I did something that I haven’t done since cribbing BASIC programs from Enter magazine back in the day: I typed in the example program in the introductory chapters. I found this to be very valuable: it got the fingers and the brain warmed up while still absorbing Rust’s new ideas — and debugging my inevitable transcription errors allowed me to get some understanding of what it was that I was typing. At the end was something that actually did something, and (importantly), by working with a program that was already correct, I was able to painlessly feel some of the tremendous promise of Rust.

Encouraged by these early (if gentle) experiences, I dove into my statemap rewrite. It took a little while (and yes, I had some altercations with the borrow checker!), but I’m almost shocked by how happy I am with the rewrite of statemaps in Rust. Because I know that many are in the shoes I occupied just a short while ago (namely, intensely wondering about Rust, but also wary of its learning curve — and concerned about the investment of time and energy that climbing it will necessitate), I would like to expand on some of the things that I love about Rust other than the ownership model. This isn’t because I don’t love the ownership model (I absolutely do) or that the ownership model isn’t core to Rust (it is rightfully thought of as Rust’s epicenter), but because I think its sheer magnitude sometimes dwarfs other attributes of Rust — attributes that I find very compelling! In a way, I am writing this for my past self — because if I have one regret about Rust, it’s that I didn’t see beyond the ownership model to learn it earlier.

I will discuss these attributes in roughly the order I discovered them with the (obvious?) caveat that this shouldn’t be considered authoritative; I’m still very much new to Rust, and my apologies in advance for any technical details that I get wrong!

1. Rust’s error handling is beautiful

The first thing that really struck me about Rust was its beautiful error handling — but to appreciate why it so resonated with me requires some additional context. Despite its obvious importance, error handling is something we haven’t really gotten right in systems software. For example, as Dave Pacheco observed with respect to node.js, we often conflate different kinds of errors — namely, programmatic errors (i.e., my program is broken because of a logic error) with operational errors (i.e., an error condition external to my program has occurred and it affects my operation). In C, this conflation is unusual, but you see it with the infamous SIGSEGV signal handler that has been known to sneak into more than one undergraduate project moments before a deadline to deal with an otherwise undebuggable condition. In the Java world, this is slightly more common with the (frowned upon) behavior of catching java.lang.NullPointerException or otherwise trying to drive on in light of clearly broken logic. And in the JavaScript world, this conflation is commonplace — and underlies one of the most serious objections to promises.

Beyond the ontological confusion, error handling suffers from an infamous mechanical problem: for a function that may return a value but may also fail, how is the caller to delineate the two conditions? (This is known as the semipredicate problem after a Lisp construct that suffers from it.) C handles this as it handles so many things: by leaving it to the programmer to figure out their own (bad) convention. Some use sentinel values (e.g., Linux system calls cleave the return space in two and use negative values to denote the error condition); some return defined values on success and failure and then set an orthogonal error code; and of course, some just silently eat errors entirely (or even worse).

C++ and Java (and many other languages before them) tried to solve this with the notion of exceptions. I do not like exceptions: for reasons not dissimilar to Dijkstra’s in his famous admonition against “goto”, I consider exceptions harmful. While they are perhaps convenient from a function signature perspective, exceptions allow errors to wait in ambush, deep in the tall grass of implicit dependencies. When the error strikes, higher-level software may well not know what hit it, let alone from whom — and suddenly an operational error has become a programmatic one. (Java tries to mitigate this sneak attack with checked exceptions, but while well-intentioned, they have serious flaws in practice.) In this regard, exceptions are a concrete example of trading the speed of developing software for its long-term operability. One of our deepest, most fundamental problems as a craft is that we have enshrined “velocity” above all else, willfully blinding ourselves to the long-term consequences of gimcrack software. Exceptions optimize for the developer by allowing them to pretend that errors are someone else’s problem — or perhaps that they just won’t happen at all.

Fortunately, exceptions aren’t the only way to solve this, and other languages take other approaches. Closure-heavy languages like JavaScript afford environments like node.js the luxury of passing an error as an argument — but this argument can be ignored or otherwise abused (and it’s untyped regardless), making this solution far from perfect. And Go uses its support for multiple return values to (by convention) return both a result and an error value. While this approach is certainly an improvement over C, it is also noisy, repetitive and error-prone.

By contrast, Rust takes an approach that is unique among systems-oriented languages: first leveraging algebraic data types — whereby a thing can be exactly one of an enumerated list of types and the programmer is required to be explicit about its type to manipulate it — and then combining them with its support for parameterized types. Together, this allows functions to return one thing that’s one of two types: one type that denotes success and one that denotes failure. The caller can then pattern match on the type of what has been returned: if it’s of the success type, it can get at the underlying thing (by unwrapping it), and if it’s of the error type, it can get at the underlying error and either handle it, propagate it, or improve upon it (by adding additional context) and propagate it. What it cannot do (or at least, cannot do implicitly) is simply ignore it: it has to deal with it explicitly, one way or the other. (For all of the details, see Recoverable Errors with Result.)

To make this concrete, in Rust you end up with code that looks like this:

fn do_it(filename: &str) -> Result<(), io::Error> {
    let stat = match fs::metadata(filename) {
        Ok(result) => { result },
        Err(err) => { return Err(err); }
    };                  

    let file = match File::open(filename) {
        Ok(result) => { result },
        Err(err) => { return Err(err); }
    };

    /* ... */

    Ok(())
}

Already, this is pretty good: it’s cleaner and more robust than multiple return values, return sentinels and exceptions — in part because the type system helps you get this correct. But it’s also verbose, so Rust takes it one step further by introducing the propagation operator: if your function returns a Result, when you call a function that itself returns a Result, you can append a question mark on the call to the function denoting that upon Ok, the result should be unwrapped and the expression becomes the unwrapped thing — and upon Err the error should be returned (and therefore propagated). This is easier seen than explained! Using the propagation operator turns our above example into this:

fn do_it_better(filename: &str) -> Result<(), io::Error> {
    let stat = fs::metadata(filename)?;
    let file = File::open(filename)?;

    /* ... */

    Ok(())
}

This, to me, is beautiful: it is robust; it is readable; it is not magic. And it is safe in that the compiler helps us arrive at this and then prevents us from straying from it.

Platforms reflect their values, and I daresay the propagation operator is an embodiment of Rust’s: balancing elegance and expressiveness with robustness and performance. This balance is reflected in a mantra that one hears frequently in the Rust community: “we can have nice things.” Which is to say: while historically some of these values were in tension (i.e., making software more expressive might implicitly be making it less robust or more poorly performing), through innovation Rust is finding solutions that don’t compromise one of these values for the sake of the other.

2. The macros are incredible

When I was first learning C, I was (rightly) warned against using the C preprocessor. But like many of the things that we are cautioned about in our youth, this warning was one that the wise give to the enthusiastic to prevent injury; the truth is far more subtle. And indeed, as I came of age as a C programmer, I not only came to use the preprocessor, but to rely upon it. Yes, it needed to be used carefully — but in the right hands it could generate cleaner, better code. (Indeed, the preprocessor is very core to the way we implement DTrace’s statically defined tracing.) So if anything, my problems with the preprocessor were not its dangers so much as its many limitations: because it is, in fact, a preprocessor and not built into the language, there were all sorts of things that it would never be able to do — like access the abstract syntax tree.

With Rust, I have been delighted by its support for hygienic macros. This not only solves the many safety problems with preprocessor-based macros, it allows them to be outrageously powerful: with access to the AST, macros are afforded an almost limitless expansion of the syntax — but invoked with an indicator (a trailing bang) that makes it clear to the programmer when they are using a macro. For example, one of the fully worked examples in Programming Rust is a json! macro that allows for JSON to be easily declared in Rust. This gets to the ergonomics of Rust, and there are many macros (e.g., format!, vec!, etc.) that make Rust more pleasant to use.
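
To give a flavor of what a declarative macro looks like, here is a trivial sketch of my own (not an example from the book): a squares! macro that expands a comma-separated list of expressions into a vector of their squares:

macro_rules! squares {
    ($($x:expr),*) => {
        vec![$($x * $x),*]
    };
}

fn main() {
    let s = squares!(1, 2, 3);
    assert_eq!(s, vec![1, 4, 9]);
    println!("{:?}", s); /* prints: [1, 4, 9] */
}

The trailing bang makes the expansion explicit at the call site, and because the macro is hygienic, any names that its expansion introduces cannot collide with (or capture) names in the calling code.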

Another advantage of macros: they are so flexible and powerful that they allow for effective experimentation. For example, the propagation operator that I love so much actually started life as a try! macro; that this macro was being used ubiquitously (and successfully) allowed a language-based solution to be considered. Languages can be (and have been!) ruined by too much experimentation happening in the language rather than in how it’s used; through its rich macros, it seems that Rust can enable the core of the language to remain smaller — and to make sure that when it expands, it is for the right reasons and in the right way.
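
For the curious, the essence of the old try! macro can be sketched in just a few lines. What follows is a simplification of the standard library’s actual definition, renamed my_try! here because try is a reserved word in the 2018 edition:

use std::fs;
use std::io;

/*
 * A simplified rendition of the old try! macro: unwrap on Ok, or
 * convert the error (via From::from) and return it on Err.
 */
macro_rules! my_try {
    ($expr:expr) => {
        match $expr {
            Ok(val) => val,
            Err(err) => return Err(From::from(err)),
        }
    };
}

fn do_it_with_macro(filename: &str) -> Result<(), io::Error> {
    let _stat = my_try!(fs::metadata(filename));
    Ok(())
}

fn main() {
    let _ = do_it_with_macro(".");
}

With this, my_try!(fs::metadata(filename)) behaves like the match-laden do_it from earlier; the propagation operator simply moved this ubiquitous pattern into the language itself.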

3. format! is a pleasure

Okay, this is a small one but it’s (another) one of those little pleasantries that has made Rust really enjoyable. Many (most? all?) languages have an approximation or equivalent of the venerable sprintf, whereby variable input is formatted according to a format string. Rust’s variant of this is the format! macro (which is in turn invoked by println!, panic!, etc.), and (in keeping with one of the broader themes of Rust) it feels like it has learned from much that came before it. It is type-safe (of course) but it is also clean in that the {} format specifier can be used on any type that implements the Display trait. I also love that the {:?} format specifier denotes that the argument’s Debug trait implementation should be invoked to print debug output. More generally, all of the format specifiers map to particular traits, allowing for an elegant approach to an historically grotty problem. There are a bunch of other niceties, and it’s all a concrete example of how Rust uses macros to deliver nice things without sullying syntax or otherwise special-casing. None of the formatting capabilities are unique to Rust, but that’s the point: in this (small) domain (as in many) Rust feels like a distillation of the best work that came before it. As anyone who has had to endure one of my talks can attest, I believe that appreciating history is essential both to understand our present and to map our future. Rust seems to have that perspective in the best ways: it is reverential of the past without being incarcerated by it.
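
As a small illustration (a sketch of my own, with a hypothetical Point type), implementing Display is all that it takes for {} to work on one’s own type, while {:?} comes along for free by deriving Debug:

use std::fmt;

#[derive(Debug)]
struct Point {
    x: u32,
    y: u32,
}

/* Implementing Display is what makes {} available on our own type. */
impl fmt::Display for Point {
    fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {
        write!(f, "({}, {})", self.x, self.y)
    }
}

fn main() {
    let p = Point { x: 1, y: 2 };
    println!("{}", p);   /* Display: prints "(1, 2)" */
    println!("{:?}", p); /* Debug: prints "Point { x: 1, y: 2 }" */
}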

4. include_str! is a godsend

One of the filthy aspects of the statemap code is that it is effectively encapsulating another program — a JavaScript program that lives in the SVG to allow for the interactivity of the statemap. This code lives in its own file, which the statemap code should pass through to the generated SVG. In the node.js/C hybrid, I am forced to locate the file in the filesystem — which is annoying because it has to be delivered along with the binary and located, etc. Now Rust — like many languages (including ES6) — has support for raw-string literals. As an aside, it’s interesting to see the discussion leading up to its addition, and in particular, how a group of people really looked at every language that does this to see what should be mimicked versus what could be improved upon. I really like the syntax that Rust converged on: r followed by one or more octothorpes followed by a quote to begin a raw string literal, and a quote followed by a matching number of octothorpes to end it, e.g.:

    let str = r##""What a curious feeling!" said Alice"##;

This alone would have allowed me to do what I want, but it would still have been a tad gross in that it’s a bunch of JavaScript living inside a raw literal in a .rs file. Enter include_str!, which allows me to tell the compiler to find the specified file in the filesystem during compilation, and statically drop it into a string variable that I can manipulate:

        ...
        /*
         * Now drop in our in-SVG code.
         */
        let lib = include_str!("statemap-svg.js");
        ...

So nice! Over the years I have wanted this many times over for my C, and it’s another one of those little (but significant!) things that make Rust so refreshing.

5. Serde is stunningly good

Serde is a Rust crate that allows for serialization and deserialization, and it’s just exceptionally good. It uses macros (and, in particular, Rust’s procedural macros) to generate structure-specific routines for serialization and deserialization. As a result, Serde requires remarkably little programmer lift to use and performs eye-wateringly well — a concrete embodiment of Rust’s repeated defiance of the conventional wisdom that programmers must choose between abstractions and performance!

For example, in the statemap implementation, the input is concatenated JSON that begins with a metadata payload. To read this payload in Rust, I define the structure, and denote that I wish to derive the Deserialize trait as implemented by Serde:

#[derive(Deserialize, Debug)]
#[allow(non_snake_case)]
struct StatemapInputMetadata {
    start: Vec<u64>,
    title: String,
    host: Option<String>,
    entityKind: Option<String>,
    states: HashMap<String, StatemapInputState>,
}

Then, to actually parse it:

     let metadata: StatemapInputMetadata = serde_json::from_str(payload)?;

That’s… it. Thanks to the magic of the propagation operator, the errors are properly handled and propagated — and it has handled tedious, error-prone things for me like the optionality of certain members (itself beautifully expressed via Rust’s ubiquitous Option type). With this one line of code, I now (robustly) have a StatemapInputMetadata instance that I can use and operate upon — and this performs incredibly well on top of it all. In this regard, Serde represents the best of software: it is a sophisticated, intricate implementation making available elegant, robust, high-performing abstractions; as legendary White Sox play-by-play announcer Hawk Harrelson might say, MERCY!

6. I love tuples

In my C, I have been known to declare anonymous structures in functions. More generally, in any strongly typed language, there are plenty of times when you don’t want to have to fill out paperwork to be able to structure your data: you just want a tad more structure for a small job. For this, Rust borrows an age-old construct from ML: tuples. Tuples are expressed as a parenthetical list, and they basically work as you expect them to work in that they are static in size and type, and you can index into any member. For example, in some test code that needs to make sure that names for colors are correctly interpreted, I have this:

        let colors = vec![
            ("aliceblue", (240, 248, 255)),
            ("antiquewhite", (250, 235, 215)),
            ("aqua", (0, 255, 255)),
            ("aquamarine", (127, 255, 212)),
            ("azure", (240, 255, 255)),
            /* ... */
        ];

Then colors[2].0 (say) will be the string “aqua”, and (colors[1].1).2 will be the integer 215. Don’t let the absence of a type declaration in the above deceive you: tuples are strongly typed; it’s just that Rust is inferring the type for me. So if I accidentally try to (say) add an element to the above vector that contains a tuple of mismatched signature (e.g., the tuple ((188, 143, 143), "rosybrown"), which has the order reversed), Rust will give me a compile-time error.
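
Here is a short sketch of my own (hypothetical code, not from the statemap tests) that shows both the inference and the type checking at work:

fn main() {
    let mut colors = vec![
        ("aqua", (0, 255, 255)),
        ("azure", (240, 255, 255)),
    ];

    /* The element type is inferred as (&str, (i32, i32, i32)). */
    colors.push(("aliceblue", (240, 248, 255)));

    /* This would be a compile-time error: the tuple is reversed. */
    /* colors.push(((188, 143, 143), "rosybrown")); */

    println!("{}", colors[0].0);     /* prints: aqua */
    println!("{}", (colors[1].1).2); /* prints: 255 */
}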

The full integration of tuples makes them a joy to use. For example, if a function returns a tuple, you can easily assign its constituent parts to disjoint variables, e.g.:

fn get_coord() -> (u32, u32) {
   (1, 2)
}

fn do_some_work() {
    let (x, y) = get_coord();
    /* x has the value 1, y has the value 2 */
}

Great stuff!

7. The integrated testing is terrific

One of my regrets on DTrace is that we didn’t start on the DTrace test suite at the same time we started the project. And even after we started building it (too late, but blessedly before we shipped it), it still lived away from the source for several years. And even now, it’s a bit of a pain to run — you really need to know it’s there.

This represents everything that’s wrong with testing in C: because it requires bespoke machinery, too many people don’t bother — even when they know better! Viz.: in the original statemap implementation, there is zero testing code — and not because I don’t believe in it, but just because it was too much work for something relatively small. Yes, there are plenty of testing frameworks for C and C++, but in my experience, the integrated frameworks are too constrictive — and again, not worth it for a smaller project.

With the rise of test-driven development, many languages have taken a more integrated approach to testing. For example, Go has a rightfully lauded testing framework, Python has unittest, etc. Rust takes a highly integrated approach that combines the best of all worlds: test code lives alongside the code that it’s testing — but without having to make the code bend to a heavyweight framework. The workhorses here are conditional compilation and Cargo, which together make it so easy to write tests and run them that I found myself doing true test-driven development with statemaps — namely writing the tests as I develop the code.
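
To show what this looks like in practice, here is a minimal sketch in which parse_color is a hypothetical helper (not actual statemap code): the test module lives in the same file as the code that it tests, and the #[cfg(test)] attribute ensures that it is compiled only when testing:

/* A hypothetical helper, standing in for the real statemap code. */
pub fn parse_color(name: &str) -> Option<(u8, u8, u8)> {
    match name {
        "aqua" => Some((0, 255, 255)),
        "azure" => Some((240, 255, 255)),
        _ => None,
    }
}

#[cfg(test)]
mod tests {
    use super::*;

    #[test]
    fn known_color() {
        assert_eq!(parse_color("aqua"), Some((0, 255, 255)));
    }

    #[test]
    fn unknown_color() {
        assert!(parse_color("mauve-ish").is_none());
    }
}

Running cargo test builds and runs these automatically; there is no external harness and no separate test tree.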

8. The community is amazing

In my experience, the best communities are ones that are inclusive in their membership but resolute in their shared values. When communities aren’t inclusive, they stagnate, or rot (or worse); when communities don’t share values, they feud and fracture. This can be a very tricky balance, especially when so many open source projects start out as the work of a single individual: it’s very hard for a community not to reflect the idiosyncrasies of its founder. This is important because in the open source era, community is critical: one is selecting a community as much as one is selecting a technology, as each informs the future of the other. One factor that I value a bit less is strictly size: some of my favorite communities are small ones — and some of my least favorite are huge.

For purposes of a community, Rust has a luxury of clearly articulated, broadly shared values that are featured prominently and reiterated frequently. If you head to the Rust website this is the first sentence you’ll read:

Rust is a systems programming language that runs blazingly fast, prevents segfaults, and guarantees thread safety.

That gets right to it: it says that as a community, we value performance and robustness — and we believe that we shouldn’t have to choose between these two. (And we have seen that this isn’t mere rhetoric, as so many Rust decisions show that these values are truly the lodestar of the project.)

And with respect to inclusiveness, it is revealing that you will likely read that statement of values in your native tongue, as the Rust web page has been translated into thirteen languages. Just the fact that it has been translated into so many languages makes Rust nearly unique among its peers. But perhaps more interesting is where this globally inclusive view likely finds its roots: among the sites of its peers, only Ruby is similarly localized. Given that several prominent Rustaceans like Steve Klabnik and Carol Nichols came from the Ruby community, it would not be unreasonable to guess that they brought this globally inclusive view with them. This kind of inclusion is one that one sees again and again in the Rust community: different perspectives from different languages and different backgrounds. Those who come to Rust bring with them their experiences — good and bad — from the old country, and the result is a melting pot of ideas. This is an inclusiveness that runs deep: by welcoming such disparate perspectives into a community and then uniting them with shared values and a common purpose, Rust achieves a rich and productive heterogeneity of thought. That is, because the community agrees about the big things (namely, its fundamental values), it has room to constructively disagree (that is, achieve consensus) on the smaller ones.

Which isn’t to say this is easy! Check out Ashley Williams in the opening keynote from RustConf 2018 for how exhausting it can be to hash through these smaller differences in practice. Rust has taken a harder path than the “traditional” BDFL model, but it’s a qualitatively better one — and I believe that many of the things that I love about Rust are a reflection of (and a tribute to) its robust community.

9. The performance rips

Finally, we come to the last thing I discovered in my Rust odyssey — but in many ways, the most important one. As I described in an internal presentation, I had experienced some frustrations trying to implement in Rust the same structure I had had in C. So I mentally gave up on performance, resolving to just get something working first, and then optimize it later.

I did get it working, and was able to benchmark it, but to give some context for the numbers, here is the time to generate a statemap in the old (slow) pure node.js implementation for a modest trace (229M, ~3.9M state transitions) on my 2.9 GHz Core i7 laptop:

% time ./statemap-js/bin/statemap ./pg-zfs.out > js.svg

real	1m23.092s
user	1m21.106s
sys	0m1.871s

This is bad — and larger input will cause it to just run out of memory. And here’s the version as reimplemented as a C/node.js hybrid:

% time ./statemap-c/bin/statemap ./pg-zfs.out > c.svg

real	0m11.800s
user	0m11.414s
sys	0m0.330s

This was (as designed) a 10X improvement in performance, and represents speed-of-light numbers in that this seems to be an optimal implementation. Because I had written my Rust naively (and my C carefully), my hope was that the Rust would be no more than 20% slower — but I was braced for pretty much anything. Or at least, I thought I was; I was actually genuinely taken aback by the results:

$ time ./statemap.rs/target/release/statemap ./pg-zfs.out > rs.svg
3943472 records processed, 24999 rectangles

real	0m8.072s
user	0m7.828s
sys	0m0.186s

Yes, you read that correctly: my naive Rust was ~32% faster than my carefully implemented C. This blew me away, and in the time since, I have spent some time on a real lab machine running SmartOS (where I have reproduced these results and been able to study them a bit). My findings are going to have to wait for another blog entry, but suffice it to say that despite executing a shockingly similar number of instructions, the Rust implementation has a different load/store mix (it is much more store-heavy than C) — and is much better behaved with respect to the cache. Given the degree that Rust passes by value, this makes some sense, but much more study is merited.

It’s also worth mentioning that there are some easy wins that will make the Rust implementation even faster: after I had publicized the fact that I had a Rust implementation of statemaps working, I was delighted when David Tolnay, the author of Serde, took the time to make some excellent suggestions for improvement. For a newcomer like me, it’s a great feeling to have someone with such deep expertise as David’s take an interest in helping me make my software perform even better — and it is revealing as to the core values of the community.

Rust’s shockingly good performance — and the community’s desire to make it even better — fundamentally changed my disposition towards it: instead of seeing Rust as a language to augment C and replace dynamic languages, I’m looking at it as a language to replace both C and dynamic languages in all but the very lowest layers of the stack. C — like assembly — will continue to have a very important place for me, but it’s hard to not see that place as getting much smaller relative to the barnstorming performance of Rust!

Beyond the first impressions

I wouldn’t want to imply that this is an exhaustive list of everything that I have fallen in love with about Rust. That list is much longer and would include at least the ownership model, the trait system, Cargo, and the type inference system. And I feel like I have just scratched the surface; I haven’t waded into known strengths of Rust like the FFI and the concurrency model! (Despite having written plenty of multithreaded code in my life, I haven’t so much as created a thread in Rust!)

Building a future

I can say with confidence that my future is in Rust. As I have spent my career doing OS kernel development, a natural question would be: do I intend to rewrite the OS kernel in Rust? In a word, no. To understand my reluctance, take some of my most recent experience: this blog entry was delayed because I needed to debug (and fix) a nasty problem with our implementation of the Linux ABI. As it turns out, Linux and SmartOS make slightly different guarantees with respect to the interaction of vfork and signals, and our code was fatally failing on a condition that should be impossible. Any old Unix hand (or quick study!) will tell you that vfork and signal disposition are each semantic superfund sites in their own right — and that their horrific (and ill-defined) confluence can only be unimaginably toxic. But the real problem is that actual software implicitly depends on these semantics — and any operating system that is going to want to run existing software will itself have to mimic them. You don’t want to write this code, because no one wants to write this code.

Now, one option (which I honor!) is to rewrite the OS from scratch, as if legacy applications essentially didn’t exist. While there is a tremendous amount of good that can come out of this (and it can find many use cases), it’s not a fit for me personally.

So while I may not want to rewrite the OS kernel in Rust, I do think that Rust is an excellent fit for much of the broader system. For example, at the recent OpenZFS Developers Summit, Matt Ahrens and I were noodling the notion of user-level components for ZFS in Rust. Specifically: zdb is badly in need of a rewrite — and Rust would make an excellent candidate for it. There are many such examples spread throughout ZFS and the broader system, including a few in the kernel. Might we want to have a device driver model that allows for Rust drivers? Maybe! (And certainly, it’s technically possible.) In any case, you can count on a lot more Rust from me and into the indefinite future — whether in the OS, near the OS, or above the OS.

Taking your own plunge

I wrote all of this up in part to not only explain why I took the plunge, but to encourage others to take their own. If you were as I was and are contemplating diving into Rust, a couple of pieces of advice, for whatever they’re worth:

  • I would recommend getting both The Rust Programming Language and Programming Rust. They are each excellent in their own right, and different enough to merit owning both. I also found it very valuable to have two different sources on subjects that were particularly thorny.
  • Understand ownership before you start to write code. The more you understand ownership in the abstract, the less you’ll have to learn at the merciless hands of compiler error messages.
  • Get in the habit of running rustc on short programs. Cargo is terrific, but I personally have found it very valuable to write short Rust programs to understand a particular idea — especially when you want to understand optional or new features of the compiler. (Roll on, non-lexical lifetimes!)
  • Be careful about porting something to Rust as a first project — or otherwise implementing something you’ve implemented before. Now, obviously, this is exactly what I did, and it can certainly be incredibly valuable to be able to compare an implementation in Rust to an implementation in another language — but it can also cut against you: the fact that I had implemented statemaps in C sent me down some paths that were right for C but wrong for Rust; I made much better progress when I rethought the implementation of my problem the way Rust wanted me to think about it.
  • Check out the New Rustacean podcast by Chris Krycho. I have really enjoyed Chris’s podcasts, and have been working my way through them when commuting or doing household chores. I particularly enjoyed his interview with Sean Griffin and his interview with Carol Nichols.
  • Check out rustlings. I learned about this a little too late for me; I wish I had known about it earlier! I did work through the Rust koans, which I enjoyed and would recommend for the first few hours with Rust.

I’m sure that there’s a bunch of stuff that I missed; if there’s a particular resource that you found useful when learning Rust, message me or leave a comment here and I’ll add it.

Let me close by offering a sincere thanks to those in the Rust community who have been working so long to develop such a terrific piece of software — and especially those who have worked so patiently to explain their work to us newcomers. You should be proud of what you’ve accomplished, both in terms of a revolutionary technology and a welcoming community — thank you for inspiring so many of us about what infrastructure software can become, and I look forward to many years of implementing in Rust!



Gaming Gets More Inclusive With The Launch Of The Xbox Adaptive Controller


Without a doubt, 2018 has been a hallmark year for inclusivity in gaming. From individual platforms and games introducing more features for gamers with accessibility needs to physical hardware like the Xbox Adaptive Controller, there has never before been such a high point for inclusivity in gaming. Available at Microsoft Stores and GameStop Online for $99.99, the first-of-its-kind Xbox Adaptive Controller launches today, so even more gamers from around the world can engage with their friends and favorite gaming content on Xbox One and Windows 10.

The Xbox Adaptive Controller will be available starting today:

Purchase Xbox Adaptive Controller from GameStop

Purchase Xbox Adaptive Controller from Microsoft Store

The Xbox Adaptive Controller is a product that was ideated and pioneered with inclusivity at its heart. We iterated on and refined it through close partnership with gamers with limited mobility and fan feedback, as well as guidance and creativity from accessibility experts, advocates and partners such as The AbleGamers Charity, The Cerebral Palsy Foundation, Craig Hospital, Special Effect and Warfighter Engaged. Even the accessible packaging the Xbox Adaptive Controller arrives in was an entirely new approach to redefining success in product packaging—directly informed and guided by gamers with limited mobility. It was truly the collaboration and teamwork of these individuals and groups that helped bring the Xbox Adaptive Controller to gamers around the world. And gaming, everywhere, becomes greater because of that collaborative spirit.


To the gamers and industry professionals around the world who shared their thoughts, feelings and feedback on either the Xbox Adaptive Controller itself or the accessible packaging it ships in—thank you. From gamers like Mike Luckett, a combat veteran based in the US who tested and shared feedback on the controller through the beta program, to gamers in the UK who kindly invited us into their homes and shared which iteration of the accessible packaging they liked most—this day of launch is a thanks to all your contributions. On behalf of gamers everywhere, we share our sincere thanks.

While the response from communities, gamers and press when we introduced the controller in May was remarkable, the true impact the Xbox Adaptive Controller has had with gamers becomes clearer when attending events like E3 in Los Angeles in June, wearing an “Xbox Adaptive Controller” t-shirt. Walking the show floor to run a simple errand, you become bombarded with smiles, greetings and high-fives—shared by gamers of all types—embracing and furthering the fondness of supporting inclusivity in gaming. It’s a powerful sentiment of appreciation for inclusivity, and we’re humbled by the reception.


Beyond the humbling praise from the gaming industry, the Xbox Adaptive Controller has been equally recognized for its innovative approach to inclusive design in gaming. In fact, just today it was announced that the V&A, the world’s leading museum of art, design and performance, has acquired the controller as part of its Rapid Response Collecting program, which collects contemporary objects reflecting major moments in recent history that touch the world of design, technology and manufacturing. It’s an honor and achievement we did not set out to accomplish, but we are nonetheless moved by the recognition of the team’s passionate work invested in the Xbox Adaptive Controller, helping it stand out as a truly first-of-its-kind product—in gaming and beyond.

Let today be a celebration of inclusivity in gaming—regardless of your platform, community or game of choice. Whether you’re a gamer using the Xbox Adaptive Controller for the first time or new to gaming, welcome to the Xbox family! Inclusivity starts with the notion of empowering everyone to have more fun. That means making our products usable by everyone, welcoming everyone, and creating a safe environment for everyone.

If you’re looking for more information on the Xbox Adaptive Controller, peripherals available today to configure it just for your use, or tips on how to get set up, we’ve got you covered. Learn more about peripherals from our hardware partners such as Logitech, RAM and PDP, used to customize your Xbox Adaptive Controller configuration, here. Visit this page to learn more about using Copilot with the Xbox Adaptive Controller. And here is some general product information to help you learn more about the Xbox Adaptive Controller. Thanks again for joining us on this incredible journey of inclusivity; see you online!


McCain

TUCSON, AZ – MARCH 26: Senator John McCain and former Alaska state Governor Sarah Palin campaign at Pima County Fairgrounds, March 26, 2010 in Tucson, Arizona. (Photo by Darren Hauck/Getty Images)

John McSame has died.

Any decent obituary of John McCain has to be as much about the media fawning over him as about the man himself. This was not a good man, and yet no one was more lionized by the Beltway media establishment in the entire recent history of American politics; possibly no one since John F. Kennedy has received more fawning coverage, and much of that for JFK was post-1963. Why McCain received this adoration may remain a mystery to historians for years because it’s completely nonsensical based on the man’s actual career. And yet, the media could never get enough of him. McCain claimed to have the most Meet the Press appearances all-time over its long run, and it’s hard to imagine that he doesn’t, although in 2007, NBC said it was Bob Dole, but that was before another decade of weekly McCain appearances.

McCain was born in the Canal Zone in 1936 to a Naval Air officer. A military brat, he traveled around all the time with his family and finally went to high school in the DC area. He entered the Naval Academy, as his father and grandfather had done. He was largely terrible, graduating 894th in a class of 899. McCain was a partier and a ladies’ man who took the social life more seriously than his early air training. But he managed to become a competent, if risk-taking pilot. He got married in 1965 and then went to Vietnam, where he asked for a combat assignment. On his 23rd mission, he was shot down over North Vietnam and nearly died, first from his injuries and landing in water, and then from being beaten after the villagers rescued him. After all, he was raining fire on them and killing them left and right. The reaction of the villagers is entirely understandable.

When the North Vietnamese discovered that McCain’s father was an admiral, he became a showpiece for them and he was treated a little better and received medical care for his wounds, although he certainly received his share of brutality after that too. He was moved from prison to prison, including two years of solitary confinement that began in March 1968. Torture started to break him. He considered suicide but was interrupted while preparing for it. He signed a bogus confession, and was later ashamed, but of course no one can really stand up to torture. He finally was released from prison in 1973, after five hellish years. Whether all this makes him a “hero” or not is a question I guess you will have to decide. I certainly don’t question the man’s toughness or personal bravery. I don’t find the term hero particularly useful and I’m unclear how this qualifies someone for the title as opposed to, I don’t know, organizing people to lift themselves out of poverty, but this is a battle I recognize I will never win. In any case, I don’t think his history should have meant anything when it came to political respectability but of course it did.

McCain returned to the U.S. and went back to his high-partying ways, making up for lost time. He had affair after affair, destroying his marriage to the woman who had waited all those years for him, a woman who had suffered through a severe car crash in the meantime. But McCain was now a celebrity and had huge ambition to take advantage of that. He entered the political world in 1977 when he became the Navy’s liaison to the Senate, introducing him to basically everybody. Still married, he began dating Cindy Hensley, the daughter of a very wealthy beer distributor. He pressured his first wife into a divorce, married Cindy, and then used her money to finance his burgeoning political career. How sure was this political career by this time? His groomsmen were Gary Hart and William Cohen. He resigned from the Navy in 1981 and prepared to run for office. Cindy’s dad hired McCain into his company and that put him firmly in the Arizona elite, where he got to know such useful people as the financier Charles Keating. Cindy funded his 1982 entrance into electoral politics, when he won an election to Congress from Arizona-01.

When McCain entered Congress, he was really nothing more than a bog-standard Republican. For a guy who made his real reputation on foreign policy, his foreign policy stances were terrible. He embraced right-wing dictators in Latin America. He traveled to Chile to meet with Augusto Pinochet. When Reagan illegally funded the Contras to overthrow the Sandinista Revolution, McCain loved it. McCain also showed his bipartisan maverickocity by taking a really brave stand—opposing making Martin Luther King Day a national holiday! Why, I haven’t seen such Republican leadership since Dick Cheney did all he could to help apartheid South Africa! McCain later said he regretted this—in hindsight, I believe him. But it doesn’t really matter what your views are when something becomes so normalized that it is universal. This is like saying one opposes slavery in 2018. Who doesn’t?!? What matters is what you did when it was time to make the decision. And as he would through most, albeit not all of his career, he failed miserably when the rubber met the road.

Of course, none of this hurt him with right-wing Arizona voters and he won election to the Senate in 1986. He continued with many of his well-defined interests. His noted love of gambling and close ties with the gambling industry led him to sponsor the Indian Gaming Regulatory Act of 1988. He took a seat on the Armed Services Committee, using it to make his love of American militarism his top policy priority. He was a big supporter of Gramm-Rudman, forcing automatic budget cuts when budget deficits occurred, deficits that were likely given McCain’s love of big things that go boom and cost lots of money. He made a positive impression on the media early on, leading to speculation that George Bush could name him as his VP candidate in 1988. Let’s face it—that would have been a much better choice for Bush than Dan Quayle!

What got John McCain his first major spotlight in the Senate? Being a member of the Keating Five. His Arizona buddy Keating had given McCain well over $100,000 in campaign contributions, had given him free flights, and all the other quasi-legal or slightly illegal perks of political influence. So when the federal government came after Keating for his crimes, he sought to cash in with McCain and the other politicians he had purchased. With that kind of corruption, you can see why the media fawned over him! Basically, McCain, Alan Cranston, John Glenn, Dennis DeConcini, and Donald Riegle intervened to protect the interests of Charles Keating in the Savings and Loan scandal, getting the Federal Home Loan Bank Board to back off its investigation. Keating had contributed $1.3 million to the five senators and this came out when the company collapsed in 1989, defrauding 23,000 bondholders. But hey, McCain actually did work with Democrats on that one! The Phoenix New Times’ Tom Fitzpatrick, in 1989:

You’re John McCain, a fallen hero who wanted to become president so desperately that you sold yourself to Charlie Keating, the wealthy con man who bears such an incredible resemblance to The Joker.

Obviously, Keating thought you could make it to the White House, too.

He poured $112,000 into your political campaigns. He became your friend. He threw fund raisers in your honor. He even made a sweet shopping-center investment deal for your wife, Cindy. Your father-in-law, Jim Hensley, was cut in on the deal, too.

Nothing was too good for you. Why not? Keating saw you as a prime investment that would pay off in the future.

So he flew you and your family around the country in his private jets. Time after time, he put you up for serene, private vacations at his vast, palatial spa in the Bahamas. All of this was so grand. You were protected from what Thomas Hardy refers to as “the madding crowd.” It was almost as though you were already staying at a presidential retreat.

Like the old song, that now seems “Long ago and far away.”

Since Keating’s collapse, you find yourself doing obscene things to save yourself from the Senate Ethics Committee’s investigation. As a matter of course, you engage in backbiting behavior that will turn you into an outcast in the Senate if you do survive.

Ouch.

It’s amazing that all the media lauding of McCain over the past decades totally forgets his corruption! But the people of Arizona didn’t care either and he won reelection in 1992 with 56% of the vote.

Here’s the key thing to know about John McCain—he really loved killing brown people around the world to show other nations how tough the United States is. Little defines him more than that. Every time there was a crisis with another nation, one usually created by American militarism—every time!—he would go on TV and massively exaggerate its importance to demonstrate the need for Americans to show toughness and, of course, bomb people. And sure, giving manly campaign speeches on the crisis in Georgia in 2008 that lifted from Wikipedia might have shown his utter intellectual vacuity, but he’s so tough and mavericky!

And then there is his class that one can only love. I mean, who but a true hero to journalists would tell the following joke, as McCain did to Republican funders in the late 1990s:

“Do you know why Chelsea Clinton is so ugly?”
“Because Janet Reno is her father.”

Ha ha ha. What mavericktude! Making fun of the looks of both a teenage girl and a pioneering Cabinet official. This is a good summary of that joke:

“The remark packed into its 15 words several layers of misogyny. It disparaged the looks of Chelsea, then 18 and barely out of high school; it portrayed Reno as a man at a time when she was serving as the first female US attorney general; and it implied that Hillary Clinton was engaged in a lesbian affair while the Monica Lewinsky scandal was blazing. Not bad going, Senator McCain.”

God, what a great American! No wonder we laud him as a hero!

Now, look, McCain wasn’t a legendarily bad senator, particularly in comparison with other early twenty-first century Republicans. On some issues, he did good things. He helped normalize relations with Vietnam and was a critical voice on this issue when it was still sensitive since the diehard POW-MIA people wanted to fight the war forever, determined that evil Asian commies were still holding our boys in torturous cells, imagining Christopher Walken in The Deer Hunter as a daily occurrence in the 1980s. And while his wife’s fortune is what propelled him into office, he did recognize that outside campaign funding was a problem in our political system. McCain-Feingold is not my favorite piece of legislation but it moved the ball toward a goal of better democracy and was probably the best bill that could be passed at the time, or since. Of course, at the same time, he was receiving contributions from the same companies he was supposed to be regulating as head of the Senate Commerce Committee. He wanted to regulate the tobacco industry more heavily, which is hardly controversial, but of course was at the time. So, fine. McCain is not Jesse Helms or James Inhofe or Ted Cruz.

McCain also very badly wanted to be president. So he played up to the Republican base on 90 percent of issues. And of course he was horrible on Bill Clinton. I strongly dislike Bill Clinton for many reasons, but the impeachment proceedings were a direct attack on American democracy, sheer political partisanship for short-term gain. Now, a real political maverick would have noted that even though the president who had been impeached was not a member of my political party, this is a bunch of nonsense that breaks the norms McCain gave lip service to respecting. Naturally, McCain did the opposite and voted for conviction. He followed up this red meat to the Republican base with a book, the sure sign that a politician is thinking about running for president. McCain decided to take on George W. Bush for the Republican nomination in 2000.

Now, even though I am not painting a particularly positive picture of McCain, he was still suspect to the Republican elite because of his very occasional actions working with Democrats. The Straight Talk Express was mostly just pandering to a media that already saw McCain as their Republican daddy who would save us from more Democrats where the men act like women and the women act like men, to borrow from American sage Maureen Dowd. And so, Bush and his allies decided to undercut McCain in the dirtiest way they could get away with.

After McCain won New Hampshire, Karl Rove and his ratfuckers went low on McCain, actually accusing him of fathering a black child out of wedlock, a reference to his adopted daughter from Bangladesh. Of course, this worked like a charm in South Carolina, Strom Thurmond having done this very thing notwithstanding. This was basically the end of the McCain campaign, with Bush winning big among evangelicals, those so-called values voters, who vote in favor of the most racist candidate possible and who love noted moral titan Donald Trump today, just as Baby Jesus would do. McCain won a few more states, but after Super Tuesday, was through.

There was some thought that McCain would have his revenge on Bush, especially after Jim Jeffords left the Republican Party and gave the Senate back to the Democrats. And there was enough of the asshole in McCain to believe this was possible. But in the end, even if Bush and friends had screwed him over personally, McCain is a genuine right-winger and liked basically all of Bush’s policies. So why would he have done this? Plus, he still wanted to be president really bad and that would kill his chances. So more or less, McCain just became a bog-standard Republican again, like he almost always was.

So McCain spent the Bush years cheerleading for the Iraq War except for the torture, which wouldn’t stop him from voting for the war but made for good soundbites to sound mavericky. He said publicly that the U.S. would be greeted as liberators by the Iraqis, which if that ever was true, didn’t last more than a New York minute. His main concern with the Iraq War in the early years was that we didn’t have enough troops there, publicly criticizing Donald Rumsfeld for believing we needed relatively few. And when the war did go disastrously for the United States, McCain was the main force in Congress behind the 2007 troop surge, which had some military effectiveness, but also made McCain completely unable to separate himself from an unpopular and pointless war at the moment he was preparing another run for the White House.

And he spent those years voting for basically every Bush domestic policy proposal. In his free time, he was on TV over and over and over again, maintaining his role as Big Media Hero. For example, the Beltway media loved McCain’s role in the Gang of 14, the bipartisan senators who crafted a compromise allowing Republicans to fill the judiciary with terrible conservative judges. But that was McCain’s game: work with gullible Democrats (of whom there were so many in the 2000s) to fashion an agreement that let someone like Janice Rogers Brown through without a filibuster, all with the end game of preserving the filibuster for Republicans to use when a Democrat was president, which of course they then did to unprecedented extremes. As for the Supreme Court, he said that John Roberts and Sam Alito were “two of the finest justices ever appointed to the United States Supreme Court.” On other policies, again, he was just a Republican seeking to move resources to the rich. Tax cuts for the rich? You bet!

On the other hand, McCain deserves some credit for not being an anti-immigration extremist. He pressed for comprehensive immigration reform, a project supported by people from George W. Bush to Ted Kennedy, with whom he cosponsored legislation. But there was no way Republican legislators were going to seriously try to pass this bill, and McCain wasn’t going to buck them enough to actually do something about it.

McCain’s 2008 presidential run was hardly predestined for success. An increasingly radicalized Republican base really hated that he wanted a reasonable solution for immigrants that did not deport them. He struggled with fundraising early on. But so did everyone else. The field was a mess, with Republicans deeply unpopular, a sadly impermanent state. Mike Huckabee was a clown who could win in Iowa, but what whackadoodle can’t win over Iowa Republicans? These are people who vote repeatedly for actual Nazi Steve King. But once McCain beat Mitt Romney in New Hampshire and Huckabee in South Carolina, it was pretty much over. Mr. Noun/Verb/9-11 completely failed, and so did Mr. Reverse Mortgage Fred Thompson.

Who did Mr. Maverick announce as his vice-presidential candidate? Why, none other than Sarah Palin! How bipartisan, naming an ignoramus and quasi-fascist whose sole policy objective was making the libs cry! It’s worth noting how much McCain damaged the nation through this choice. Sarah Palin was an irresponsible, incompetent clown. But because she delivered red meat to the base, and because the racists who vote Republican loved it, her handlers and supporters realized that she was the ticket to the future. This wasn’t all on her; it’s not as if George Bush’s naming of Dan Quayle to the ticket in 1988 was all that different, even if Quayle was a generation of wingnut before Palin. But in the aftermath, Palin-esque politicians, more competent and less self-involved, would deliver Republican victories by making white people feel good about expressing their resentments in the crudest way possible. This of course paved the way for Donald Trump, someone even less competent and more self-involved than Palin, but even better at white supremacy. Thanks, John.

In fact, it’s amazing that his entire 2008 campaign didn’t permanently kill his reputation. Even outside of Palin, McCain said nasty thing after nasty thing. Sure, he might have joked about pimping out his wife to bikers at the Sturgis rally, but that’s just boys being boys, amiright? There was his constant invocation of Joe the Plumber, an early rendition of the Trump campaign if there ever was one. McCain repeatedly tried to taint Obama by calling his policies “socialist.” By the end of the campaign, the entire world was relieved. Europeans, who had long heard of McCain’s maverick reputation, ended up calling him John McSame as they realized he was a very nasty old man who supported nearly all the failed policies of George W. Bush. Moreover, it’s not as if McCain softened his positions. He still supported a constitutional amendment against abortion. He either wouldn’t or just couldn’t answer basic questions about sex education and whether contraception stops the spread of HIV. He campaigned on extending the Bush tax cuts by reducing Social Security, but wouldn’t specify how that would work. Now that’s the kind of bipartisanship that excites the Beltway! He publicly said he would consult with Sam Brownback on judicial appointments, ensuring that whoever he named would be, well, someone like Neil Gorsuch.

Then there was the time he couldn’t remember how many houses he owned, or rather, how many his extremely wealthy wife owned. It turned out the answer was more than ten. Who can keep count! And really, that was a great answer in the middle of a housing crisis. I hadn’t seen a Republican presidential candidate so in touch with everyday people since George H.W. Bush marveled at a grocery store checkout scanner.

Of course, nothing McCain said was enough for the rabid Republican base, who were dying for someone like Donald Trump. McCain rallies became openly racist and Islamophobic, with Republicans demanding he attack Obama in the most disgusting ways possible. McCain did at least resist the worst of this.

Failing pretty miserably in the polls, McCain tried to get Barack Obama to join him in suspending their campaigns on September 24 so they could return to the Senate and work on the financial crisis. Not being a complete idiot, Obama refused. And as November rolled around, it was clear that McCain would get blown out of the water, which he did.

And here’s the rub: nothing about McCain’s lionization by the media before, during, or after his presidential campaign made any sense at all. With very few exceptions, he was just a standard right-wing Republican, a Goldwater follower who cared a great deal about foreign policy. Basically, McCain’s relationship with the media grew out of their deep desire for a Republican Daddy who would protect their financial assets, make America look real tough on the international stage, pay lip service to international standards of behavior, and at least not sound like a maniac on social policy, whatever his actual voting record. This is the ideal for our Beltway media, and it’s disgusting.

McCain returned to the Senate in 2009 and played the exact same role as before. It’s amazing that he wasn’t subjected to the rule that losing presidential candidates must disappear after the election. Or wait, is that law only applied to female presidential candidates? Hmmm… Anyway, McCain remained the same bog-standard hack he always was, railing against pork in the federal budget by bringing up hi-larious line items such as government funding for beaver management programs, as if that weren’t an actual issue land managers face. There was a brief moment when he and Obama had a good relationship, but that ended as soon as Mitch McConnell decided that the Republican response to losing power would be to destroy as many norms of American politics as he could, in the most cynical way possible. McCain joined in with aplomb, despite his reputation with the media as a bipartisan leader, a reputation seriously damaged for about a month at the end of the election and then mysteriously rehabbed. McCain ripped Obama for pulling out of plans to build an unnecessary missile defense complex in Poland. Despite his previous support for doing something about climate change, he now refused to engage with any constructive legislation to address it. Not with Obama in the White House he wouldn’t! He led the filibuster to stop the repeal of “Don’t Ask, Don’t Tell.” When it finally was repealed, he said it was “a very sad day” that would undermine the military. And it’s true: how has the U.S. military functioned since, what with all the gay sex? McCain hated the Affordable Care Act when it was passed, regardless of his later vote to save it. He once sponsored the DREAM Act, but now voted against it. I could go on. If McCain had been a maverick before, which he hadn’t, he was a full-fledged hack now.

Yes, McCain had some issues where he held respectable standards of decency. He consistently opposed torture, but then did absolutely nothing to oppose pro-torture politicians outside of that narrow zone. He might vote against a particular nominee who had been directly involved in torture, but then he would go on talk show after talk show defending the people who put that person there, along with the policies that led to the torture and would lead to more. The McCain-Feingold campaign finance bill was a good one, but again, once it began to be chipped away, McCain did nothing but support the very people responsible. After Benghazi, McCain was on the front lines accusing Hillary Clinton of awful things she was not responsible for, calling it worse than Watergate and ensuring that Susan Rice would not succeed Clinton as Secretary of State. Of course, McCain was all about intensifying the war in Syria with the massive army of our supposed allies. What could have gone wrong! He would occasionally return to bipartisan action, such as his support for comprehensive immigration reform, but in the end, he almost always put the Republican Party over the nation’s needs. When he could have really stood up against Donald Trump, a man who had directly insulted him, he did not. He voted for the judges, voted for Jefferson Beauregard Sessions III, voted for almost the entire Trump/Ryan/McConnell policy agenda.

Overall, the man had very few principles that trumped his extreme partisanship. Take, for example, his position on Supreme Court justices. For much of his career, he voted for whomever a president nominated, whether it was Robert Bork or Ruth Bader Ginsburg. If you believe a president has the right to name whoever they want to the Court, then live by it. OK. But at the end, when control of the Court was in the balance and Mitch McConnell was willing to destroy two centuries of norms to advance his radical right-wing agenda, McCain completely changed course! First, he voted against Sonia Sotomayor. Then there was no way he would vote on Merrick Garland. And after he helped McConnell prevent Obama from filling that seat, he stated that Republicans would block any Supreme Court nomination a President Hillary Clinton made, saying “I promise you that we will be united against any Supreme Court nominee that Hillary Clinton, if she were president, would put up.” Now that’s some independent bipartisan leadership! And look, I am more than happy to give McCain a bit of credit for voting against the repeal of the ACA in 2017. He certainly doesn’t deserve more credit than anyone else who voted against the bill, but fine. Good for you. For once you were no worse than the worst Democrat in the Senate.

On very rare occasions, particularly in the 2008-10 period, reporters would realize how awful McCain actually was, write a column claiming to reconsider the man, and then go back to lauding him soon after. Even before he died, McCain became the vehicle through which reporters could pursue their wet dreams in obituary form. Dana Milbank’s may be the most sycophantic, but this exchange between Bret Stephens and Gail Collins really isn’t any better. CNN decided it was time to publish articles by Asian Americans forgiving McCain for his grotesque racism; after all, what are feelings of marginalization due to racial discrimination compared to the veneration of War Hero Maverick McCain?

Who will our lovely media turn their desperate attention to now? Is Michael Bloomberg the only man who can save us? Is Lindsey Graham the Republican Daddy we need? A generation of Meet the Press appearances demands to know!

John McCain is survived by, among others, his wife Cindy and his daughter Meghan, who has recently spent her time hyperventilating about inheritance taxes and marrying an actual fascist.


Porting Coreboot to the 51NB X210

The X210 is a strange machine. A set of Chinese enthusiasts developed a series of motherboards that slot into old Thinkpad chassis, providing significantly more up to date hardware. The X210 has a Kabylake CPU, supports up to 32GB of RAM, has an NVMe-capable M.2 slot and has eDP support - and it fits into an X200 or X201 chassis, which means it also comes with a classic Thinkpad keyboard. We ordered some from a Facebook page (a process that involved wiring a large chunk of money to a Chinese bank, which wasn't at all stressful), and a couple of weeks later they arrived. Once I'd put mine together I had a quad-core i7-8550U with 16GB of RAM, a 512GB NVMe drive and a 1920x1200 display. I'd transplanted over the drive from my XPS13, so I was running stock Fedora for most of this development process.

The other fun thing about it is that none of the firmware flashing protection is enabled, including Intel Boot Guard. This means running a custom firmware image is possible, and what would a ridiculous custom Thinkpad be without ridiculous custom firmware? A shadow of its potential, that's what. So, I read the Coreboot[1] motherboard porting guide and set to.

My life was made a great deal easier by the existence of a port for the Purism Librem 13v2. This is a Skylake system, and Skylake and Kabylake are very similar platforms, so the first job was simply to copy that into a new directory and start from there. The next step was to update the Inteltool utility so it understood the chipset - this commit shows what was necessary there. It's mostly just adding new PCI IDs, but it also needed some adjustment to account for the GPIO allocation being different on mobile parts when compared to desktop ones. One thing that bit me - Inteltool relies on being able to mmap() arbitrary bits of physical address space, and the kernel doesn't allow that if CONFIG_STRICT_DEVMEM is enabled. I had to disable that first.
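For the curious, that physical-memory access is only a few lines of code. Here's a minimal sketch of the sort of mapping a tool like Inteltool depends on (illustrative, not Inteltool's actual code) - it's precisely this mmap() of /dev/mem that a CONFIG_STRICT_DEVMEM kernel refuses:

    /* Sketch: map a physical address range via /dev/mem, as hardware dump
     * tools do. With CONFIG_STRICT_DEVMEM enabled, the mmap() below fails
     * for most of the physical address space. */
    #include <stdio.h>
    #include <stdint.h>
    #include <fcntl.h>
    #include <unistd.h>
    #include <sys/mman.h>

    static void *map_physical(uint64_t phys_addr, size_t len)
    {
        int fd = open("/dev/mem", O_RDWR | O_SYNC);
        if (fd < 0) {
            perror("open(/dev/mem)");
            return NULL;
        }
        void *virt = mmap(NULL, len, PROT_READ | PROT_WRITE,
                          MAP_SHARED, fd, (off_t)phys_addr);
        close(fd);
        return virt == MAP_FAILED ? NULL : virt;
    }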

The GPIO pins got dropped into gpio.h. I ended up just pushing the raw values into there rather than parsing them back into more semantically meaningful definitions, partly because I don't understand what these things do that well and largely because I'm lazy. Once that was done, on to the next step.
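For a flavour of what that looks like: coreboot describes each Skylake/Kabylake pad with two raw 32-bit registers, and provides a _PAD_CFG_STRUCT macro that takes them untranslated. Something in this spirit, where the pad names are real coreboot identifiers but the values are made up rather than the X210's actual dump:

    /* Illustrative gpio.h fragment: raw DW0/DW1 register values straight
     * from an Inteltool dump, with no attempt to turn them into semantic
     * PAD_CFG_* macros. The values here are placeholders. */
    static const struct pad_config gpio_table[] = {
        _PAD_CFG_STRUCT(GPP_A0, 0x44000500, 0x00000000),
        _PAD_CFG_STRUCT(GPP_A1, 0x44000402, 0x00003c00),
        _PAD_CFG_STRUCT(GPP_A2, 0x44000402, 0x00003c00),
        /* ... one entry per pad in the dump ... */
    };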

High Definition Audio devices (or HDA) have a standard interface, but the codecs attached to the HDA device vary - both in terms of their own configuration, and in terms of dealing with how the board designer may have laid things out. Thankfully the existing configuration could be copied from /sys/class/sound/card0/hwC0D0/init_pin_configs[2] and then hda_verb.h could be updated.
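The file format is mercifully simple: a codec ID, a subsystem ID, an entry count, then one pin-default verb per pin. A hypothetical fragment - the codec and pin values below are placeholders, not the X210's:

    /* Hypothetical hda_verb.h fragment: pin defaults lifted from the
     * running vendor system's init_pin_configs. IDs and values are
     * examples only. */
    #include <device/azalia_device.h>

    const u32 cim_verb_data[] = {
        0x10ec0269,    /* codec vendor/device ID (a Realtek part, say) */
        0xffffffff,    /* subsystem ID */
        2,             /* number of entries that follow */
        AZALIA_PIN_CFG(0, 0x14, 0x90170110),    /* internal speaker */
        AZALIA_PIN_CFG(0, 0x19, 0x03a11020),    /* headset mic jack */
    };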

One more piece of hardware-specific configuration is the Video BIOS Table, or VBT. This contains information used by the graphics drivers (firmware or OS-level) to configure the display correctly, and again is somewhat system-specific. This can be grabbed from /sys/kernel/debug/dri/0/i915_vbt.

A lot of the remaining platform-specific configuration has been split out into board-specific config files, and this also needed updating. Most stuff was the same, but I confirmed the GPE and genx_dec register values by using Inteltool to dump them from the vendor system and copying them over. lspci -t gave me the bus topology and told me which PCIe root ports were in use, and lsusb -t gave me port numbers for USB. That let me update the root port and USB tables.

The final code update required was to tell the OS how to communicate with the embedded controller. Various ACPI functions are actually handled by this autonomous device, but it's still necessary for the OS to know how to obtain information from it. This involves writing some ACPI code, but that's largely a matter of cutting and pasting from the vendor firmware - the EC layout depends on the EC firmware rather than the system firmware, and we weren't planning on changing the EC firmware in any way. Using ifdtool told me that the vendor firmware image wasn't using the EC region of the flash, so my assumption was that the EC had its own firmware stored somewhere else. I was ready to flash.

The first attempt involved isis' machine, using their Beaglebone Black as a flashing device - the lack of protection in the firmware meant we ought to be able to get away with using flashrom directly on the host SPI controller, but using an external flasher meant we stood a better chance of being able to recover if something went wrong. We flashed, plugged in the power and… nothing. Literally. The power LED didn't turn on. The machine was very, very dead.

Things like managing battery charging and status indicators are up to the EC, and the complete absence of anything going on here meant that the EC wasn't running. The most likely reason for that was that the system flash did contain the EC's firmware even though the descriptor said it didn't, and now the system was very unhappy. Worse, the flash wouldn't speak to us any more - the power supply from the Beaglebone to the flash chip was sufficient to power up the EC, and the EC was then holding onto the SPI bus desperately trying to read its firmware. Bother. This was made rather more embarrassing because isis had explicitly raised concern about flashing an image that didn't contain any EC firmware, and now I'd killed their laptop.

After some digging I was able to find EC firmware for a related 51NB system, and looking at that gave me a bunch of strings that seemed reasonably identifiable. Looking at the original vendor ROM showed very similar code located at offset 0x00200000 into the image, so I added a small tool to inject the EC firmware (basing it on an existing tool that does something similar for the EC in some HP laptops). I now had an image that I was reasonably confident would get further, but we couldn't flash it. Next step seemed like it was going to involve desoldering the flash from the board, which is a colossal pain. Time to sleep on the problem.
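The injection tool itself doesn't need to be clever. Here's a stripped-down sketch of the idea (my actual tool was based on the HP one; the filenames here are arbitrary, and the offset is the one found above):

    /* Sketch: splice an EC firmware blob into a full ROM image at the
     * offset where the vendor image keeps it. Minimal error handling. */
    #include <stdio.h>

    #define EC_OFFSET 0x00200000

    int main(int argc, char **argv)
    {
        if (argc != 3) {
            fprintf(stderr, "usage: %s rom.bin ec.bin\n", argv[0]);
            return 1;
        }
        FILE *rom = fopen(argv[1], "r+b");
        FILE *ec = fopen(argv[2], "rb");
        if (!rom || !ec) {
            perror("fopen");
            return 1;
        }
        fseek(rom, EC_OFFSET, SEEK_SET);    /* seek to the EC region */
        int c;
        while ((c = fgetc(ec)) != EOF)      /* copy the blob byte by byte */
            fputc(c, rom);
        fclose(ec);
        fclose(rom);
        return 0;
    }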

The next morning we were able to borrow a Dediprog SPI flasher. These are much faster than doing SPI over GPIO lines, and also support running the flash at different voltages. At 3.5V the behaviour was the same as we'd seen the previous night - nothing. According to the datasheet, the flash required at least 2.7V to run, but flashrom listed 1.8V as the next lower voltage, so we tried that. And, amazingly, it worked - not reliably, but sufficiently. Our hypothesis is that the chip is marginally able to run at that voltage, but that the EC isn't - we were no longer powering the EC up, and so could communicate with the flash. After a couple of attempts we were able to write enough that we had EC firmware on there, at which point we could shift back to flashing at 3.5V because the EC was leaving the flash alone.

So, we flashed again. And, amazingly, we ended up staring at a UEFI shell prompt[3]. USB wasn't working, and nor was the onboard keyboard, but we had graphics and were executing actual firmware code. I was able to get USB working fairly quickly - it turns out that Linux numbers USB ports from 1 and the FSP numbers them from 0, and fixing that up gave us working USB. We were able to boot Linux! Except there were a whole bunch of errors complaining about EC timeouts, and we also only had half the RAM we should have.

After some discussion on the Coreboot IRC channel, we figured out the RAM issue - the Librem13 only has one DIMM slot. The FSP expects to be given a set of i2c addresses to probe, one for each DIMM socket. It is then able to read back the DIMM configuration and configure the memory controller appropriately. Running i2cdetect against the system SMBus gave us a range of devices, including one at 0x50 and one at 0x52. The detected DIMM was at 0x50, which made 0x52 seem like a reasonable bet - and grepping the tree showed that several other systems used 0x52 as the address for their second socket. Adding that to the list of addresses and passing it to the FSP gave us all our RAM.
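In coreboot's Skylake/Kabylake FSP glue, that fix amounts to a couple of lines in the board's memory init parameters. A hedged sketch following the pattern of in-tree boards - the FSP field takes 8-bit SMBus addresses, so 7-bit 0x50 and 0x52 become 0xa0 and 0xa4:

    /* Sketch: hand the FSP both DIMM socket addresses to probe. */
    #include <soc/romstage.h>

    void mainboard_memory_init_params(FSPM_UPD *mupd)
    {
        FSP_M_CONFIG *mem_cfg = &mupd->FspmConfig;

        mem_cfg->SpdAddressTable[0] = 0xa0;    /* socket 0: 7-bit 0x50 */
        mem_cfg->SpdAddressTable[1] = 0xa4;    /* socket 1: 7-bit 0x52 */
    }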

So, now we just had to deal with the EC. One thing we noticed was that if we flashed the vendor firmware, ran it, flashed Coreboot and then rebooted without cutting the power, the EC worked. This strongly suggested that there was some setup code happening in the vendor firmware that configured the EC appropriately, and if we duplicated that it would probably work. Unfortunately, figuring out exactly what that code was proved difficult. I ended up dumping the PCI device configuration for the vendor firmware and for Coreboot in case that would give us any clues, but the only thing that seemed relevant at all was that the LPC controller was configured to pass IO ports 0x4e and 0x4f to the LPC bus with the vendor firmware, but not with Coreboot. Unfortunately the EC was supposed to be listening on 0x62 and 0x66, so this wasn't the problem.

I ended up solving this by using UEFITool to extract all the code from the vendor firmware, then disassembling every object and grepping for port IO. x86 systems have two separate IO buses - memory and port IO. Port IO is well suited to simple devices that don't need a lot of bandwidth, and the EC is definitely one of these - there's no way to talk to it other than using port IO, so any configuration was almost certainly happening that way. I found a whole bunch of stuff that touched the EC, but was clearly depending on it already having been enabled. I found a wide range of cases where port IO was being used for early PCI configuration. And, finally, I found some code that reconfigured the LPC bridge to route 0x4e and 0x4f to the LPC bus (explaining the configuration change I'd seen earlier), and then wrote a bunch of values to those addresses. I mimicked those, and suddenly the EC started responding.

It turns out that the writes that made this work weren't terribly magic. PCs used to have a SuperIO chip that provided most of the legacy port functionality, including the floppy drive controller and parallel and serial ports. Individual components (called logical devices, or LDNs) could be enabled and disabled using a sequence of writes that was fairly consistent between vendors. Someone on the Coreboot IRC channel recognised that the writes that enabled the EC were simply using that protocol to enable a series of LDNs, which apparently correspond to things like "Working EC" and "Working keyboard". And with that, we were done.
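For reference, the protocol is just an index/data pair: write a register number to the index port, then a value to the data port, with register 0x07 selecting the logical device and register 0x30 activating it. A sketch of the shape of those magic writes - the unlock sequence and LDN numbers vary by chip, so the ones below are placeholders:

    /* Sketch of the generic SuperIO configuration protocol on the
     * secondary ports 0x4e/0x4f. Runs as root in userspace; the same
     * outb()s can equally happen in early firmware init. */
    #include <stdint.h>
    #include <sys/io.h>

    #define SIO_INDEX 0x4e
    #define SIO_DATA  0x4f

    static void sio_write(uint8_t reg, uint8_t val)
    {
        outb(reg, SIO_INDEX);
        outb(val, SIO_DATA);
    }

    static void sio_enable_ldn(uint8_t ldn)
    {
        sio_write(0x07, ldn);     /* select logical device */
        sio_write(0x30, 0x01);    /* set its "activate" bit */
    }

    int main(void)
    {
        if (iopl(3))              /* need port IO privileges */
            return 1;
        sio_enable_ldn(0x05);     /* e.g. "working keyboard" - placeholder */
        sio_enable_ldn(0x11);     /* e.g. "working EC" - placeholder */
        return 0;
    }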

Coreboot doesn't currently have ACPI support for the latest Intel graphics chipsets, so right now my image doesn't have working backlight control. Backlight control also turned out to be interesting. Most modern Intel systems handle the backlight via registers in the GPU, but the X210 uses the embedded controller (possibly because it supports both LVDS and eDP panels). This means that adding a simple display stub is sufficient - all we have to do on a backlight set request is store the value in the EC, and it does the rest.
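Concretely, the stub boils down to one write into EC RAM over the standard ACPI EC interface (data port 0x62, command/status port 0x66). A hedged sketch using coreboot-style port IO - the 0x81 write command and the IBF status bit come from the ACPI spec, but the brightness offset in EC RAM is a placeholder:

    /* Sketch: set the panel backlight by writing a level into EC RAM. */
    #include <stdint.h>
    #include <arch/io.h>    /* coreboot's inb()/outb() */

    #define EC_DATA 0x62
    #define EC_CMD  0x66
    #define EC_IBF  (1 << 1)    /* input buffer full status bit */

    static void ec_wait(void)
    {
        while (inb(EC_CMD) & EC_IBF)
            ;    /* wait for the EC to consume the last byte */
    }

    static void ec_write(uint8_t offset, uint8_t value)
    {
        ec_wait();
        outb(0x81, EC_CMD);     /* standard ACPI EC "write" command */
        ec_wait();
        outb(offset, EC_DATA);
        ec_wait();
        outb(value, EC_DATA);
    }

    void set_backlight(uint8_t level)
    {
        ec_write(0xcc, level);    /* hypothetical brightness offset */
    }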

Other than that, everything seems to work (although there's probably a bunch of power management optimisation to do). I started this process knowing almost nothing about Coreboot, but thanks to the help of people on IRC I was able to get things working in about two days of work[4] and now have firmware that's about as custom as my laptop.

[1] Why not Libreboot? Because modern Intel SoCs haven't had their memory initialisation code reverse engineered, so the only way to boot them is to use the proprietary Intel Firmware Support Package.
[2] Card 0, device 0
[3] After a few false starts - it turns out that the initial memory training can take a surprisingly long time, and we kept giving up before that had happened
[4] Spread over 5 or so days of real time
