
Nearly 6 in 10 Canadians call lack of new pipeline capacity a 'crisis,' poll suggests


A slight majority of Canadians are calling the lack of new oil pipeline capacity in the country a crisis, according to findings from a recent survey by the Angus Reid Institute.

nocko · 17 hours ago:
Crisis? Old people with land lines bad at estimating risk... Film at eleven.

Listening to Estrogen


Photo-Illustration: Artwork by CUR3ES

When she was 45, at around the same time her menstrual periods became irregular, Janet developed an obsession with a man at work. Now 61, Janet has the same job she had back then, managing the chemistry-department stockroom at a small northeastern college. But at the time, she was studying for a master’s degree and living in a suburb with her husband and teenage daughter, and Jim was a new addition to the chemistry faculty. He had a quiet, sensitive nervousness that appealed to Janet, and she felt from the outset that they had a bond.

The first time Janet began to feel seriously off was in the spring of 2001, when she was taking Jim’s inorganic chemistry class. Sitting in the lecture hall, listening to Jim talk about metals and compounds, Janet would feel a pain in her head, on the left side, slightly above and behind her ear — she indicates the area as if she were brushing away a fly — “like something was trying to get out of my head. And I knew it wasn’t right. Like a tumor. The weird thing was I would rub it and go, ‘Oh, I wish that would go away,’ and it would go away for about two minutes and then it would come back again. I’d say, ‘Well, a tumor wouldn’t go away if you rubbed it. So it’s not a tumor.’ ”

Janet first heard the voice — male, about 30 years old — while she was out with Jim and a group of co-workers at a TGI Fridays near campus. She was gazing at Jim (whose name, like some others’ in this story, has been changed) in the bar and thinking about how nice it would be to put her head on his shoulder. The voice said, “Go ahead!” It wanted her to snuggle up to him. She didn’t. She knew that would be ridiculous.

But the voice — which Janet decided sounded like Jim’s — persisted. Within a couple of months, Janet was conversing regularly with it, a telepathic communication she started to imagine was emanating straight from Jim’s brain. “I thought he really loved me and cared about me but couldn’t say anything out loud,” she says. “The voice was talking to me just like it was Jim. It might as well have been Jim, as far as I was concerned.” So real did Janet’s conversations seem that she went to Jim’s house one evening to confront him about his feelings for her. He refused to answer the door, though she could see through the window that he was home.

The voice became Janet’s best company. It was Jim, but it also became more than Jim. She says it sounded exactly like him, but maybe it was God or Jesus or a spiritual guide. Speaking now, Janet sometimes uses the singular, voice, and more often the plural, voices. Together with the voices, she would go on adventures. Janet lives in a small town surrounded by forest and winding highways, and one evening when she was taking a walk she saw glowing orbs floating before her, about the size of grapefruits and at waist height; the voices told her that if she danced between them in a specific order, she could save the world. “So I danced all the way down, about a mile, and all the way back, and then I came home and danced in my apartment, all around the apartment, some more.”

As the months went on, Janet started to feel scared. The voices could be threatening. They always made a big deal out of little things. They yelled at her. During one of our conversations, we are sitting together in the chemical stockroom, a dungeonlike space with concrete floors and walls that is strewn with fast-food wrappers, old textbooks, and barrels of solvents. Janet is perched on the very edge of her swivel chair, leaning forward, and she points to my sunglasses, which are facedown on a long table. The voices, she says, were frequently harsh and critical about unimportant things. If they saw the sunglasses facedown, for example, “they would say, ‘Oh my God, you’ve got to move the glasses! They’re on the table the wrong way! It’s an emergency!’ ”

Janet realized she was barely functioning. She was getting herself to work every day, but she wasn’t eating and couldn’t focus. “Filing was almost impossible,” she says. Her marriage had fallen apart. Her daughter, busy with high school and alarmed at her mother’s strange behavior, went to live with her father. One day, overcome, Janet left work with the idea that, to escape the voices, she would drive her car into a tree. She wrecked the car but was physically unharmed, and when a police officer came to the scene, “I was kind of honest with him, and he got me an ambulance.” At the hospital, Janet was given her first dose of Risperdal, an antipsychotic delivered in pink or orange pills, which quiets the voices but doesn’t silence them completely. Janet had never had a serious mental illness before, and she has no family history of mental illness. She believes her breakdown was triggered by Jim’s arrival in her life and by the vulnerable moment at which it happened. Janet was perimenopausal, the period before menopause when a woman’s estrogen levels begin to fluctuate wildly. And she was “very upset.”

Photo-Illustration: Joe Darrow for New York Magazine

Schizophrenia has always been regarded as a disorder of the young — a catastrophic unraveling of the mind in late adolescence or early adulthood. It presents most calamitously among young men, who are frequently diagnosed in their teenage years; for women, diagnosis comes years later. A patient will show up in the emergency room after a period of disorganized behavior, not having eaten or washed for weeks, often speaking incoherently, and presenting with what the DSM-5 calls “positive” symptoms — hallucinations, delusions — as well as some “negative” ones, such as lethargy. Patients may not be able to describe what’s happening to them or even perceive the extent to which their thoughts have become unhinged, but family members do. Schizophrenia is a chronic illness, they learn. There is no cure, only therapies and drugs that can, in some cases, control the symptoms to a manageable degree. Even with treatment, recurrence is a near certainty. And the lifetime prognosis is bad. Life expectancy for people with schizophrenia is 10 to 25 years shorter than for those in the general population. Death by suicide is 13 times more likely.

Youth has been a diagnostic criterion for schizophrenia for a hundred years, including within the pages of the DSM, where the diagnosis has sometimes carried an explicit age limit: As recently as the 1980s, a person could not be tagged schizophrenic if he or she was older than 40. Some clinics targeting early intervention have cutoff ages as young as 24.

But schizophrenia does not neatly comply with that simplistic understanding. In the early 1990s, three British psychiatrists, curious about why men with schizophrenia had their first psychotic episode so much earlier than women, took a look at the voluminous diagnostic records in doctors’ offices and hospitals in one populous London neighborhood covering a period of 20 years. They found something astonishing: a demonstrable “second peak” of first-onset schizophrenia after 45. These patients were predominantly female.

These older patients compose just a fraction of the total number. About one percent of people worldwide receive a schizophrenia diagnosis, and almost 20 percent of them are diagnosed for the first time after the age of 45. But the data suggested a deeply embedded bias in the way doctors had thought about schizophrenia for a century, overlooking the middle-aged women who came to them with psychotic symptoms, refusing to believe they could have schizophrenia because the official classifications, and medical tradition, excluded them. In their view, “madness” associated with “the change of life” was not madness at all — not a serious affliction to be taken seriously — but a women’s malady to be treated with bleeding and leeches, herbs and ointments, drugs, alcohol, and the desiccated and powdered ovaries of farm animals. Committed to American asylums in the late 19th century, women with mysterious symptoms were labeled “insane from suppressed menses.” And a whole ecosystem of diagnosis and treatment failed to grow.

There is, to be sure, genuine tragedy in lost human potential at a young age. But it is also tragic for a woman to become mentally ill in the middle of her life, at a time when she has, if she’s been lucky, built a universe — a family, a job, friendships, a network of responsibilities and dependencies erected on the assumption of stability. She might have adolescent children and aging parents, professional duties and bills to pay. She might have a classroom of students; she might be the mortgage broker helping a family keep ownership of their home or the doctor advising on a chemotherapy plan.

A psychiatrist in Boston told me about a patient, a devout Christian with no previous history of mental illness, who suddenly believed a devil was occupying her teenage son and she needed to exorcise that spirit with a kitchen knife. The terrorized son called 911 and accompanied his mother to the ER. Another doctor told me about a patient so disconnected and paranoid that, while insisting on her own mental health, she had alienated every single person who might have helped her, including her parents and her husband, who wanted a divorce but instead quit his job to stay at home — so afraid was he of what she would do if he left their two young children with her unattended. A health-care worker in Northern California told me about becoming psychotic, catatonic, and suicidal as she approached menopause. At the time, she lived alone with her youngest daughter, who was 13. “Basically, quite a few times, she saved my life by calling 911. I would take an overdose and then I’d go to the hospital. Then they’d send me home and my poor child was frightened because of all she had to deal with. She’s still recovering. She’s 30.”

It has now been more than 20 years since the official discovery of that second peak, but the fact of it remains almost entirely invisible within the psychiatric Establishment and even more so among general doctors — and thus unavailable as an explanation to women experiencing their first breakdowns later in life and ignored as a guide for how to best treat them. Instead, women who become suddenly psychotic in middle age are typically given diagnoses that indicate an attending doctor’s baffled shrug (“psychosis not otherwise specified” was Janet’s diagnosis) and are treated with powerful antipsychotics that were long tested in clinical trials on men. These women, unclassified and living at the margins of health, are alienated further by medicine’s bewilderment at them.

But over those same 20 years, a tiny group of mainly female psychiatrists, working independently all over the world, from inside American universities and organizations like the National Institute of Mental Health to labs as far away as Switzerland and Spain, began to study these women. They believe that in some, the dramatic fluctuations in hormones that accompany the onset of menopause may help to trigger schizophrenia. This idea is called “the estrogen hypothesis.”

This hypothesis is powerful in that it helps to explain a large number of divergent cases. There are those second-peak schizophrenic women, who have breakdowns similar to those more commonly recognized in the young. But the circle of women whose lives and suffering may be illuminated by the estrogen hypothesis is much wider — women with psychosis linked to fluctuations in estrogen at all phases of life. There are women with traditional, early-onset schizophrenia whose hallucinations become more or less vivid in concert with their hormonal cycles, and women who become suddenly and dramatically psychotic — or catatonic or suicidal — when their estrogen plummets in the weeks after childbirth. Beyond that, there are women with previously diagnosed and controlled mental illnesses (bipolar disorder, major depression) who experience a recurrence or exacerbation during the menopause transition, as well as women who feel suicidal when others have PMS.

All women understand themselves to be in constant conversation with their hormones, which they know can wreak havoc with their mood but can also bolster and stabilize it. Yet this insight has been boxed away as somehow irrelevant to diagnosis and treatment by doctors unwilling to consider the possibility that the models of illness extrapolated from the experiences of men apply incompletely to women. “If men can’t get an erection, it’s a natural disaster,” says Catherine Birndorf, a reproductive psychiatrist who recently left Cornell to start a freestanding outpatient center in New York City for women with postpartum and perinatal mental illness. “But very little is studied in women because of our reproductive capacity and because of patriarchy. We need to try to understand women better.”

From his perch as the head of psychiatry at Columbia University, Jeffrey Lieberman puts it more bluntly: “Medicine hasn’t paid much attention to these women. Middle-aged women are low priority, like children used to be.”

For her psychiatric residency at Columbia in the early 1960s, Mary Seeman worked at an all-women’s psychiatric ward of Manhattan State Hospital. She was struck by how different female patients with schizophrenia looked and acted compared with what she’d read about in her textbooks. The rap on schizophrenia at the time was that it made people disconnected, but Seeman found her female patients to be just the opposite. They were lively, engaged, and responsive, able to converse with each other and with her in meaningful ways — even when they were medicated with antipsychotics, which frequently make people affectless.

When Seeman moved home to Canada, she found the same thing. In Toronto, she led group-therapy sessions for male and female patients with schizophrenia. In the men’s group, patients sat like lumps, not making eye contact and failing to remember one another’s names. “They seemed on the surface not to have feelings. They were unresponsive, wanting to be alone,” she says. But the women asked about each other’s children, shared knitting patterns, and made dates for coffee. Many of the female patients were diagnosed with something other than schizophrenia because their symptoms looked so different.

Other clinical gender differences struck Seeman too. As the textbooks predicted, men with schizophrenia seemed to improve as they got older, needing smaller doses of antipsychotic medication to control their symptoms. Young women with regular estrogen cycles needed less medicine than young men, but around the time of menopause, they started to fare worse. She remembers one patient in particular, a woman who always had a job and a boyfriend — a steadiness that is very unusual among people with a schizophrenia diagnosis. “She looked smart, was well dressed and never in the hospital,” Seeman says. “Then, roughly around the time of menopause, something remarkable happened. She was no longer working; she broke up with her boyfriend and became suicidal. Once she brought this difference to my attention, I began to notice that she wasn’t alone.”

No one knows exactly how schizophrenia starts — or, for that matter, what it is. Genes definitely play a role. People with a first cousin who has schizophrenia are more than twice as likely as the population at large to receive the diagnosis, and for those with a parent diagnosed, the risk is 15-fold. But environment interacts with the genes to turn them on in certain cases, and researchers have sought to pinpoint those triggers — to determine which circumstances or behaviors put a person who is predisposed at increased risk. Pot smoking in adolescence is thought to be a risk factor; so are head injuries during birth and in early childhood. In utero infections may play a role: Psychiatrists talk about the “spring birth” risk, by which they mean that fetuses can contract the viruses that thrive during the winter months in some parts of the world, increasing their susceptibility to schizophrenia later on. These are all hypotheses with some data to support them. The truth is, for all the codifications of the DSM, science still has an extremely rudimentary and tentative understanding of what actually happens in the brains of people we recognize as mentally ill. And political institutions routinely fail to meaningfully fund that inquiry. The estrogen hypothesis is valuable not because it provides a clear-cut answer as to why some women mentally deteriorate in midlife but because it suggests one way, in the face of a terrifying mystery, to begin exploring it.

Mary Seeman is 83 years old, with cropped white hair and a sharp, wise face. She still lives in Toronto, her sixth-floor apartment a jungle gym of rails and poles to accommodate her husband, Philip, who is 84 and has a degenerative muscle disease. Seeman was one of the first psychiatrists to put forward the estrogen hypothesis, an idea inspired, she told me, by Philip.

While Mary was busy training to be a psychiatrist, Philip, a psychopharmacologist, was trying to figure out how antipsychotic drugs worked on the brain. The first antipsychotic, Thorazine, came on the market in 1954. It had the miraculous effect of quieting psych wards overnight, as it switched off the hallucinations and delusions that made patients babble and scream, enabling them to become functional and coherent again. It was also a puzzle: designed as a companion drug to anesthesia, Thorazine quieted psychosis for reasons no one, including its manufacturers, understood. Philip applied himself to finding out.

In 1974, he published an extraordinary discovery: All antipsychotic drugs work the same way. Psychosis likely results from excessive levels of dopamine — the brain chemical that gives feelings of pleasure and a “high” — possibly from faulty receptors that allow a flooding to occur. The antipsychotics block the dopamine receptors, creating a barrier like a lid over a pot, preventing the receptors from “encountering” the dopamine at all. (This is why people who take antipsychotic drugs say their emotions feel so muted; the medicine interrupts their pleasure mechanism.) In the Seemans’ apartment, Philip slowly abandons his walker and pulls up a chair. He brightens. Taking a pen from his pocket, he leans over the coffee table between us and starts to draw on a pad, making a big triangle. That’s the key dopamine receptor — D2, as it’s called. Then he draws an arrow intercepting and blunting the triangle’s sharp point. That’s the antipsychotic drug creating a dam against dopamine.

He is disabled and frail, but Philip’s face begins to shine as he gets to his point. He draws another arrow intercepting his triangle from a different direction. This is estrogen. It also blunts the effect of dopamine and acts as an inhibitor, similar to an antipsychotic. “And that’s really why women don’t start wars,” he says.

“I think women have started wars,” Mary says, laughing.

“Not so much,” says Philip.

The Seemans’ discoveries were symbiotic. If estrogen modulates psychosis, it might explain why schizophrenic symptoms in menstruating women were less severe than those in men and why these women needed lower doses of antipsychotics to control them. It might even be protective enough to delay onset for a number of years. Sudden, dramatic fluctuations in estrogen during perimenopause, the months or years before a woman stops menstruating, might explain why a woman with no previous history of mental illness might suddenly come down with a bad case of psychosis. And the absence of estrogen after menopause might explain why a woman’s psychotic symptoms could suddenly resemble those of a very young man.

In 1981, Mary Seeman published one of the first papers suggesting the estrogen hypothesis in a small journal connected with the University of Ottawa. She was the only author, making a mild plea for her psychiatry colleagues to pay more attention to the role of estrogen in schizophrenia. The plea fell on deaf ears. Even now, the study of estrogen’s effects on psychosis is so diffuse and interdisciplinary that “I can’t really say it’s a field,” she told me.

In 1988, a young psychiatrist named Laura Miller was tasked with opening one of the country’s first women’s reproductive-mental-health clinics at the University of Illinois at Chicago after a patient at a local state psychiatric hospital, who had been delusionally denying her pregnancy, accidentally drowned her newborn in the toilet. There was no established discipline in reproductive mental health at the time, no association for practitioners with specialized expertise. “I contacted everyone who had written a paper on the subject. We were self-taught and formed a coalition. There were about a dozen of us,” Miller says.

Researchers around the world began to explore the connections between psychosis and estrogen at every phase of a woman’s life. They found that early puberty seems to correlate with later onset of schizophrenia, positing that the presence of estrogen delays its debut somehow. They found that in women with preexisting psychotic disorders, severity of symptoms declined during pregnancy — when estrogen surges in a woman’s body — and then rose again after birth when estrogen tanks. Among women of childbearing age with an existing schizophrenia diagnosis, a large minority reported an increased severity of symptoms just before menstruation, when estrogen dips.

In 2013, premenstrual dysphoric disorder became an official diagnosis in the new revision of the DSM, an acknowledgment that for 5.5 percent of women, the phase usually known as PMS can be debilitating, contributing to severe depression, lost days of work, dangerous ruptures in relationships, and even suicide. Older feminists opposed the classification, arguing that it made a pathology of being female, but younger feminists disagreed. By establishing the category, they said, a group of sufferers could finally be acknowledged and receive the treatment they require (and have insurance cover that treatment).

But the medical Establishment still hasn’t really caught on. The DSM hardly mentions reproductive hormones, and the doctors who look to hormonal changes for answers or causes remain a tiny minority. Only 59 percent of psychiatry residencies require any training in reproductive psychiatry at all, and far fewer hold residents to a standard of competency. In the meantime, people like Talia continue to suffer. Talia had been diagnosed with bipolar disorder in her 20s but had it under control with medication. Then around Christmas 2016, just after the election of Donald Trump, when she was 42 and beginning to get hot flashes that forced her to keep her bedroom window open even on frigid nights, Talia came down with a terrible cold and cough that led to ten days of insomnia. She became exhausted and told her regular psychiatrist that she was “delirious.” She felt so disconnected from herself.

At the large midwestern university where she is a professor, Talia had been studying the Nazi program Aktion T4, the wholesale rounding up and extermination of people who were disabled, elderly, or mentally ill, and now she began to fear that she was a Nazi target. It was the worst kind of paranoia, she says, because it was based in her own extensive research. She knew everything about Aktion T4. “I knew how the paperwork was done for Aktion T4. I knew how the bus system worked,” she tells me as we sit one afternoon at her dining-room table. Pale and wide-eyed, she comes across as delicate, like a person who has survived a wreck or a trauma, which she has. “I fell off a cliff,” is how she puts it. The election of Trump amplified her fears. Her rational mind clearly saw the historical analogies between the nationalistic right of 2016 and the Nazi Party. Her irrational mind turned an academic observation into a full-blown reality. To stop Nazi doctors from spying on her, Talia covered the windows in her front hall with white computer paper and kept the blinds in the living room permanently closed. One afternoon when she was home alone, there was a knock at the door and Talia hid in the closet, so sure was she that the Nazis had come to take her away.

It was around this time that Talia started pacing the kitchen ceaselessly, unable to speak. She plastered every inch of the dining table — the same one where we sit with our coffee mugs — with Post-it reminders of things to do. She couldn’t shop or cook or drive or read. “I remember trying to read the directions on a muffin box and it was bewildering to me,” she says. She was able to communicate with her husband, Ted, only by scribbling notes, and Ted, a professor at the same university, was forced to tell Talia’s department chair that she was unwell; he then took a leave from his job to care for her and their daughter. Ted remembers wondering throughout that awful time whether this was the new normal, with his wife mute and the shades always drawn. “In my mind, I’m running a thousand miles a minute,” he says. “So now we’re just going to be writing everything to each other? It’s ridiculous, but I start thinking, If I can accept this, then we’re good.”

In the hospital, Talia received a diagnosis of schizoaffective disorder, which is, as she explains to me, as if “schizophrenia and bipolar had a baby.” She spent two weeks there, and longer working out a medication regimen that controls her psychosis but doesn’t make her feel, as she puts it, “blunted.” She now takes Vraylar, an antipsychotic; Lamictal, a mood stabilizer; and, to help with her cognitive impairments, an Alzheimer’s drug called galantamine. For the first time in two years, she feels like herself, although she still can’t read as she used to: Abstract material is difficult and she doesn’t retain information as easily, a loss that infuriates her. Talia’s periods are irregular — “I’m bleeding on and off all the time,” she says — possibly a symptom of perimenopause or a side effect of the medicine she takes. Her psychiatrist has tried to reassure her that the approach of menopause will not necessarily exacerbate her symptoms, because this does not happen in every case. “I’ve kind of resigned myself to the fact that it probably will recur,” she tells me. “You know, I haven’t had a long enough period of being really symptom free to get comfortable. I’m just kind of like always waiting.”

How might Talia’s — or Janet’s — life have been different? Imagine a world in which medicine and science really prioritized women, addressing and inquiring into our femaleness, including our menstrual cycles and hormone fluctuations, rather than shoehorning us into categories established by men, substantiated by research on men, and designed to diagnose and treat men. What if the experts who create the disease taxonomies in the DSM established tags and labels for the collections of severe symptoms that can occasionally accompany fluctuations in women’s hormonal cycles so that any internist, anywhere, could flip the book open, make the connection, and know what to do? What if practitioners understood how female hormones impaired or enhanced the effectiveness of medicine? What if women with midlife psychosis formed a recognizable cohort and saw one another as regular travelers through an unmarked and treacherous territory? Imagine if Janet and Talia were visible to one another, and to us.

Lieberman at Columbia concedes that the treatment so often received by outlier patients “doesn’t speak well for the standard of care,” but he points out how many different groups fail to attract meaningful attention from medicine — including women, minorities, and most of the mentally ill. Older women are deemed to be “over the hill, so why bother?”

In 2009, an Australian psychiatrist named Jayashri Kulkarni began publishing the results of extraordinary experiments that took the estrogen hypothesis to the next step. If fluctuations in estrogen exacerbated psychosis, then shouldn’t infusions of estrogen — supplemental hormones — regulate and ameliorate it? In a small trial, Kulkarni administered estrogen along with regularly prescribed antipsychotic medication to a group of women of childbearing age diagnosed with schizophrenia; she found that, compared with a control group, their positive and negative symptoms abated. She tried again with a larger group and got a similar result. She tried with a tiny group of men — administering estrogen together with antipsychotics for one week to avoid the hormone’s feminizing side effects (such as the growth of breasts) — and found the same. By day five, the male estrogen group “showed significant abatement of psychotic symptoms compared to the placebo group,” and by day seven, the men had improved even more.

Cognizant of estrogen’s risks — in 2002, lead researchers had interrupted the massive Women’s Health Initiative study on hormone-replacement therapy after it showed a dramatic increase in breast-cancer incidence and heart-disease risk — Kulkarni then did another series of experiments, this time on menopausal women with schizophrenia, using drugs that simulate estrogen, which carry fewer consequential risks. Here, too, she found “a robust therapeutic effect.”

It was dinnertime in New York when I spoke to Kulkarni on the phone. It was early morning the next day in Melbourne, and the doctor was taking a walk on the beach before work; through the cell-phone connection, I could hear the sea wind. Kulkarni’s experiments had gained her so much attention, she told me, that women all over the world — convinced their problems have hormonal causes and unable to get traction with their doctors at home — are seeking her help.

Sometimes women want to make an appointment at her clinic. But often they or their family members just want advice. Recently, a high-ranking member of the Italian military called her. He was desperately worried about his wife. At 50, the wife was a prominent society hostess who enthusiastically supported their “high-flying life,” as Kulkarni put it, “with lots of state dinners.” But very suddenly, over the space of two weeks, the wife began to unravel completely: She heard voices; she believed that her cell phone was bugged; she thought World War III was about to erupt and became super-vigilant, always looking behind her and checking the corners and closets in her house. The husband became anxious at first, then extremely distressed. The wife had no history of mental illness. She had always been fit, fashionable, and energetic, but now she was incoherent. “My hypothesis was that she was definitely in the menopause transition,” Kulkarni said, and the husband confirmed her hunch. The time between his wife’s periods had been lengthening.

Kulkarni then found herself in “a strange, three-way conversation.” She sent the couple to their regular Italian physician and joined them in his office by Skype. Kulkarni suggested to the doctor that he prescribe Tibolone, a tablet containing low doses of estrogen, progesterone, and an androgen, in addition to the antipsychotic the wife was already taking. The Italian doctor was resistant, but the husband, an influential man, convinced him to prescribe the drug. Kulkarni remembered thinking that a less forceful or prominent person might not have prevailed: “The doctor was having a difficult time making the connection between ‘She’s got psychotic mental illness’ and ‘Why are we treating her this way?’ ”

This is a happy story. The wife got better. Within ten days, she had stopped feeling paranoid, and on Skype “her face looked less worried and tense.” Within eight weeks, she was entirely herself, and when the couple videoconferenced with Kulkarni, the wife was leading the conversation, slapping her husband fondly on the knee when he interrupted.

When I talk to American psychiatrists about Kulkarni’s experiments and raise the notion of treating late-onset schizophrenic psychosis — or indeed any psychosis — in women with estrogen, here is what they say. They urge caution. They say we don’t know what causes schizophrenia, exactly, that estrogen plays a role but no one knows what role. They remind me that many hormones are involved in the menstrual cycle. They say fluctuations in estrogen interact with the brain, but so does aging and so does stress and all of these conditions are present at the onset of menopause, so who’s to say what the real culprit is. They point to the 128 genes connected with schizophrenia and the research linking it possibly to infection or a faulty immune response. They say there have always been vogues in cures for schizophrenia; because it’s so devastating, the temptation to believe in miracles is great. They point to a large study done recently by Israeli and Romanian researchers that failed to replicate Kulkarni’s results. “I do think that estrogen has some connection to schizophrenia, there’s no question about that, but there have been so many false starts,” says Dilip Jeste, a geriatric neuropsychiatrist at the University of California San Diego. “A few will swear by one medication, but the problem comes in generalizing. That doesn’t mean other women should start taking estrogen.” Even David Castle, the discoverer of the second peak, is skeptical of estrogen’s role. “The estrogen story is rather overstated,” he wrote in an email, before adding, “and estrogens have their side effects, as you will be aware.”

Here is what I have to say in response: Estrogen does not have to be a single or straightforward cause of schizophrenia for it to merit the attention of researchers and doctors; it does not have to be a proven miracle cure to represent an opportunity. The treatment of schizophrenia remains today a matter of trial and error, and estrogen represents one promising path forward. Indeed, Sage Therapeutics is currently seeking FDA approval for a progesterone-like synthetic hormone that promises to alleviate postpartum depression, which makes sense. The links between estrogen and mental illness are clear and the prospects for treatment too interesting to dismiss, particularly given how poorly women have been served, over centuries, by doctors who did not want to look squarely at their femaleness in treating them or who saw their problems as expressions of that femaleness and therefore not worth treating.

Sometime between the two World Wars, my great-grandmother died on the operating table after suffering from what my grandfather always vaguely referred to as “female problems.” I have always assumed she was having her uterus removed, but maybe the surgery was for something else and I will never know why she needed an operation or how she died, whether from anesthesia or bleeding or some surgical error. Did she have cancer? A mental illness? Heavy periods? In any case, the mystery surrounding her health is passed down to me. I am an enlightened, empowered, 21st-century woman, but when doctors ask if I have a family history of uterine, ovarian, or cervical cancer, I have to tell them I don’t know.

The stigma associated with women’s menstruation — and hormones and uteruses — goes back at least to Leviticus, which establishes a catalogue of all the ways monthly bleeding makes a woman “unclean.” But menstruation has historically signaled more than just female filth; it has been seen as the cause of women’s anger, volatility, instability, unreliability, weakness, frailty, and neuroticism. The ancient Greeks, including probably Hippocrates, believed the uterus could sometimes become unmoored inside a woman’s body and rattle around, resulting in excessive emotion. The best remedy for this, they thought, was rigorous sex.

Even now, when we know better and pay lip service to the idea that monthly bleeding (and its cessation) is normal and natural, and women laugh over dinner about loving the conveniences of their new period-absorbent underwear, the stigma around menstruation persists and constricts women in two ways. Our legitimate emotions, moods, and reactions are chalked up to our hormonal cycles — if we are pissed off or weepy, we are said to be “on the rag,” a diminishment of our anger or distress and a denial of whatever’s troubling us — and at the same time any actual health problems that arise from our reproductive cycles or hormones or organs are disbelieved or minimized. Even in the 21st century, the superstition persists that our uteruses are prone to wandering: The mother of my friend G, with whom I trained for a marathon, told her she shouldn’t run so much lest her uterus “drop.” G’s mother feared that her uterus might collapse or crash somehow into her cervix, causing infertility, pain, and other unspecified misery.

Yet just as our reproductive organs are thought to make us fragile, emotional, and irrational, we are expected to endure their effects on our bodies and minds stoically and without complaint. Boyfriends and husbands perpetuate this bias, but so do doctors, even elite ones. And if menstruation remains taboo, even in an era when little girls strut around wearing T-shirts that read the future is female, then menopause is worse, because the only thing more disgusting and shameful in culture than the manifestations of fertility — the blood and the egg-white discharge and the hormonal cloud — is the absence of all of that. In the Bible, an infertile woman is labeled cursed.

A woman approaching menopause who becomes psychotic, then, is buried in stigma. She is middle-aged, on the path to becoming an invisible, voiceless member of society, beset by “female trouble” — her symptoms likely to be diminished and disbelieved. I spoke to an Englishwoman named Val who became very, very ill as she approached menopause, at first staying up all night enrolling in online theology classes and then becoming depressed to the point of paralysis and suicide; even though she had had postpartum psychosis after the birth of her first child, her doctors did not make the connection or identify fluctuating hormones as a plausible trigger for her mental illness. “We tried really every antidepressant. Nothing worked. I had electric-shock treatment. That didn’t work. Had lithium. Went months and months and months with nothing working. Eventually, we tried a tricyclic antidepressant and that worked,” she says. “Estrogen was the one thing nobody suggested. It seems now with hindsight that hormone treatment might have been the best option.” A woman like Val defies the traditional definitions of psychotic illness, and very few people are engaged in trying to figure out how to help her.

On a bright summer evening, I was walking home from the subway and talking with Pauline Maki on the phone. She works as the director of women’s-health research at the University of Illinois at Chicago, the center Laura Miller helped to establish, and in a long conversation, Maki explained how she views her job as advocacy. “I think we need better science,” she told me. “There is so much paternalism around everything hormonal. I think women need to stand up and demand this kind of work. More women need to tell their stories. You don’t want to pathologize menopause generally; it’s not a problem for most women. But if we fail to recognize that it is a problem for a certain percentage of women, then we do women a tremendous amount of harm.” And it occurred to me that for 40 years, I have kept a private, internal calendar of my moods in correlation to my menstrual cycle, watching my impatience boil the week before my period begins, together with the onset of an almost unbearable irritation at being jostled by strangers on the subway. Now, as I approach menopause myself, I would like my doctors to be curious about all that, so that in the event of any health crisis they will see me whole — as a middle-aged woman with a lot to lose — presume my intelligence and good judgment, and present me or my proxies with a range of options without predispositions about what I should do.

Hormone-replacement therapy, once promoted as a menopause “cure,” a way to rejuvenate and resexualize middle-aged women, became, after the halting of the Women’s Health Initiative study, a cultural sin, widely disapproved of by doctors and wellness bloggers alike. Now women who inquire about hormones for physical or mood symptoms are talked out of them as a matter of course. But the pendulum of public opinion on hormone supplements is starting to swing back, thanks in large part to the determined activism of the trans movement. The truth is the dangers of hormone supplements were overstated in the publicity around the WHI. They pose not a universal or certain danger but a serious risk in some cases, and with careful oversight some of those risks can be managed. If presented with the awful dilemma of ruinous midlife psychosis or a possible breast-cancer risk, shouldn’t a woman be allowed to choose?

On the day I visit Janet at home, there is a tornado. My phone is blaring warnings as I follow her down back roads, but we keep going, pulling into the parking lot of her condo complex just as the afternoon grows black and the trees blow horizontal and the rain cascades not in drops or sheets but as if the ocean were dumping out of the sky. I wait out the tornado alone in my vehicle, talking to my husband on the phone, while my small car trembles and shakes, until finally the rain abates and Janet knocks on my window. The electricity has gone out in her cluttered apartment, so I need a flashlight to see the pictures on her walls, mostly collages and paintings she made. She is very lonely, she tells me. Her daughter has just moved to South Carolina, and her relationship to her church community has frayed. “Little by little, one by one, the people in my life have all dropped out, and now I’m back to being kind of solitary again. And I don’t like that,” she says. The voices are still there, Janet tells me, but she is able to dismiss them and they mostly don’t bother her. She takes her Risperdal regularly, in diminishing doses, with a plan to get off it completely. I have to get home, but I can tell she doesn’t want me to leave.

The next day, Janet writes me a long email. After the tornado, and our conversation, she had a difficult night. Her regular isolation had been exacerbated by the blackout, making it difficult to sleep. It is as if in reaching out to me, Janet is searching for a unifying answer that will solve the riddle of her breakdown and cure her loneliness at once. “Here’s my current philosophy,” she said to me when I presented her with the estrogen hypothesis. “Maybe, just maybe, like God says in the Bible, God is in us, and we are in him. Maybe this is the spirit that God leaves with us that’s talking to me. But I don’t think God intended it to talk to me. I think something happens in menopause. We don’t know enough about nutrition or behavior or menopause or premenopause — whatever — to help people process this age in their lives. I have moments when I think it was all just a chemical imbalance. If I hadn’t had the perimenopausal event, I don’t think any of it would have happened. But I also think there’s so much more going on in our brains than we are aware of.

“And I think there is knowledge out there, but I think it’s old, ancient knowledge that has been lost to the generations through the rapid, rapid changes — I’m talking about the past 50 years — and explosions in population. We don’t live the way we used to. We used to live tribally. The tribes could always share. There was a huge close-knit community that could share. I know what we need. I don’t know how to get it, but I know what we need: We need people who understand what is happening to us to sit down with us and explain it. At some level, we could just use someone to say, ‘Okay, you’re 45 years old. You’re perimenopausal. You’re hearing voices. Here’s what’s helped these three dozen other people. Tell me your story.’ ”

This article appears in the December 24, 2018, issue of New York Magazine.


Exclave: Hardware Testing in Mass Production, Made Easier


Reputable factories will test 100% of every product shipped. For example, the computer or phone you’re using to read this has had a plug inserted in every connector, along with dozens of internal and external tests run to confirm everything from the correct operation of the CPU to the proper function of the buttons.


A test station at a motherboard factory (2x speed). Every port and connector gets tested.

Even highly automated processes can yield defective units: entropy happens, and constant vigilance is required to guard against it. Even a very stable manufacturing process with a raw defect rate of around 1% is considered unacceptable by any reputable brand (at that rate, an untested run of 10,000 units would ship on the order of 100 defective ones). This is one of the elephants in the digital fabrication room – just because a tool is digital doesn’t mean it will fabricate things perfectly with a push of the button. Every tool needs maintenance, and more often than not a skilled operator is required to inspect the final product and polish over rough edges.

To better grasp the magnitude of the factory test problem, consider the software that’s loaded on your computer. How did it get in there? Devices come out of the silicon foundry mostly blank. They typically don’t even have the innate knowledge to traverse a filesystem, much less connect to the Internet to download an update. Yet everyone has had the experience of waiting for an update to download and install. Factories must orchestrate a much more time-consuming and complicated process to bootstrap every device made, in order for you to enjoy the privilege of connecting to the Internet to download updates.

One might think, “Surely, there must be a standardized way of handling this.”

Shockingly, there isn’t.

How Not To Test a Product

Unfortunately, first-time product makers often assume that either products don’t require 100% testing (because the boards are assembled by robots, and robots don’t make mistakes, right?), or that there is some standardized way to handle the initial firmware upload. Once upon a time, I was called upon to intervene on a factory test for an Arduino-derivative product, where the original test specification was literally “plug the device into the USB port of [your] laptop, and type in this AVRDUDE command to load code, and then type in another AVRDUDE command to set the fuses, and then use a multimeter to check the voltages on these two test points”. The test documentation was literally two photographs of the laptop screen and a paragraph of text. The product’s designer argued to the factory that this was sufficient because it’s quick and reliable: he could do it in under two minutes; how could any competent factory that handles products with AVR chips not have heard of AVRDUDE; and besides, he had encountered no defects in the half dozen prototypes he produced by hand. All of this came wrapped in an over-arching attitude of “whatever, I’m the smart guy who comes up with the ideas, just get your minimum-wage Chinese laborers to stop messing them up”.

The reality is that asking someone to manually run commands from a shell and read a meter for hours on end while expecting zero defects is neither humane nor practical. Furthermore, it isn’t realistic to assume operators have the ability and judgment to run command-line scripts; testing is time-consuming, and thus often the least-skilled, lowest-wage laborers are employed for the process. Ironically, there is no correlation between the skills required to assemble a computer and the skills required to operate one. Thus, in order for the factory to meet the product designer’s expectation of low labor cost with simultaneously high quality, it’s up to the product designer to come up with an automated, fool-proof test jig.
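As a sketch of what “automated and fool-proof” means even for that simple AVR test: wrap the AVRDUDE invocations in a script that checks every exit code and reads the test-point voltages itself, so the operator’s only job is to seat the board and watch for a pass/fail verdict. This is a hedged illustration, not the actual test from that story: the part name, programmer, fuse value, voltage limits, and the read_voltage() ADC helper are all assumptions.

    import subprocess

    PART, PROG = "m328p", "usbtiny"   # assumed AVR part and programmer

    def run(cmd):
        # avrdude exits non-zero if programming or verification fails
        return subprocess.run(cmd, capture_output=True).returncode == 0

    def read_voltage(test_point):
        raise NotImplementedError("hypothetical: sample the jig's ADC channel")

    def test_one_unit():
        ok = run(["avrdude", "-p", PART, "-c", PROG,
                  "-U", "flash:w:firmware.hex:i"])       # load code
        ok = ok and run(["avrdude", "-p", PART, "-c", PROG,
                         "-U", "lfuse:w:0xFF:m"])        # set fuses (assumed value)
        # replace the multimeter step: check both test points automatically
        ok = ok and 3.20 <= read_voltage("TP1") <= 3.40
        ok = ok and 4.75 <= read_voltage("TP2") <= 5.25
        return ok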

Introducing the Test Jig: The Product Behind the Product

“Test jig” is a generic term for any tool designed to assist with production testing. However, there is a basic format for a test jig chassis, and demand for such chassis is so high in places like Shenzhen that entire cottage industries have sprung up to meet it. Most circuit board test jigs look a bit like this:


Above: NeTV2 circuit board test jig

And the short video below highlights the spring-loaded pogo pins of the test jig, along with how a circuit board is inserted into a test jig and clamped in place for testing.


Above: Inserting an NeTV2 PCB into its test jig.

As you can see in the video, the circuit board is placed into a precision-milled platter that moves along spring-loaded rails, allowing the board to engage with pogo-pin style test points underneath. As test points consume precious space on the circuit board, the overall mechanical accuracy of the system has to be better than +/-1mm once all tolerances are considered over thousands of cycles of wear and tear, in order to keep the test points a reasonable size (under 2mm in diameter).

The specific test jig shown above measures 12 separate DC voltages, performs a basic JTAG ID code check on the FPGA, loads firmware, and tests the on-board DRAM, all in under 20 seconds. It’s the preliminary “fast test” of the NeTV2 product, meant to screen out gross solder faults; it provides an estimated coverage of about 80% of the solder joints on the PCB. The remaining 20% of the solder joints belong principally to connectors, which require a much more labor-intensive manual test to check.
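For a sense of what the DC-voltage portion of such a fast test looks like in software, here is a minimal sketch. The rail names, nominal values, tolerances, and the read_rail() ADC helper are hypothetical, not the actual NeTV2 test code.

    # Hypothetical rail table: name -> (nominal volts, allowed fractional error)
    RAILS = {
        "5V_IN":    (5.0, 0.05),
        "3V3":      (3.3, 0.03),
        "1V8":      (1.8, 0.03),
        "1V0_CORE": (1.0, 0.03),
    }

    def read_rail(name):
        raise NotImplementedError("hypothetical: sample the jig's ADC mux channel")

    def check_rails():
        failures = []
        for name, (nominal, tol) in RAILS.items():
            v = read_rail(name)
            if abs(v - nominal) > nominal * tol:
                failures.append((name, v))
        return failures   # an empty list means every rail is within tolerance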

Here’s a look inside the test jig:

If it looks complicated, that’s because it is. Test jig complexity is correlated with product complexity, which is why I like to say the test jig is the “product behind the product”. In some cases, a product designer may spend even more time designing a test jig than they spend designing the product itself. There’s a very large space of problems to consider when implementing a test jig, ranging from test coverage to operator fatigue, and of course throughput and reliability.

Here’s a list of the basic issues to consider when designing a test jig:

  • Coverage: How to test every single feature?
  • UX: Who is interpreting your test data? How to internationalize the UI by using symbols and colors instead of text, and how to minimize operator fatigue?
  • Automation: What’s the quickest way to set up and tear down tests? How to avoid relying on human judgment?
  • Audit & traceability: How do you enforce testing standards? How to incorporate logging and coupons to facilitate material traceability?
  • Updates: What do you do when the tester needs a patch or update? How do you keep the test program in lock-step with the current firmware release?
  • Responsibility: Who is responsible for product quality? How do you create a natural incentive to design-for-test from the very first product sketch?
  • Code Structure: How do you maintain the tester’s code base? It’s tempting to think that test jig code should be write-once, since it’s going into a single device with a limited user base. However, the reality of production is rarely so simple, and it pays to structure your code base so that it’s self-checking, modular, reconfigurable, and reliable.

Each of these bullet points is an aspect of test jig design that I have learned from the school of hard knocks.

Read on, and avoid my mistakes.

Coverage

Ideally, a tester should cover 100% of the features of a product. But what, exactly, constitutes a feature? I once designed a product called the Chumby One, and I also designed its test procedure. I tried my best to cover all of its features, but I missed one: the power button. It seemed simple enough – just a switch, what could go wrong? It turns out that over the course of production, the tolerance between the mechanical switch pusher and the electrical switch mechanism had drifted to the point where pushing on the cap would not contact the electrical switch itself, leading to a cohort of returns from that production lot.

Even the simplest of mechanisms is a feature that needs to be tested.

Since that experience, I’ve adopted an “inside/outside” methodology to derive the test feature list. First, I look “inside” the product, going through the schematic and picking key features for testing. The priority is to check for solder faults as quickly as possible, based on the assumption that the constituent components are 100% pre-tested and reliable. Then, I look at the product from the “outside”, as a consumer might approach it. First, I look at the marketing brochure and see what was promised: “world class WiFi performance” demands a different level of test from “product has WiFi”. Then, I try to imagine all the ways a customer might interact with the product – such as pressing the power button – and add those points to the test list. This means every connector needs to have something stuffed in it, every switch must be pressed, and every indicator light must be checked.


Red arrow calls out the mechanical switch pusher that drifted out of tolerance with the corresponding electrical switch

UX

Test jig UX can have a large impact on test throughput and reliability; test operators are human, and like all humans are susceptible to fatigue and boredom. A startup I worked with once told me a story of how a simple UX change drastically improved test throughput. They had a test that would take 10 minutes on average to run, so in order to achieve a net throughput of around 1 minute per unit, they provided the factory with 10 testers. Significantly, the test run-time would vary by several minutes from unit to unit. Unfortunately, the only indicator of test state was a single light that could either flash or change color. Furthermore, the lighting pattern of units that failed testing bore a resemblance to units that were still running the test, so even when the operator noticed a unit that finished testing, they would often overlook failed units, assuming they were still running the test. As a result, the actual throughput achieved on their first production run was about one unit every 5 minutes — driving up labor costs dramatically.

Once they refactored the UX to include an audible chime that would play when the test was finished, aggregate test cycle time dropped to a bit over a minute – much closer to the original estimate.

Thus, while one might think UX is just for users, I’ve found it pays to make wireframes and mock-ups for the tester itself, and to spend some developer cycles to create an operator-friendly test program. In some ways, tester UX design is more challenging than the product UX: ideally, you’re creating a UX with icons that are internationally recognizable, using little or no text, so operators anywhere in the world can just sit down and use it with no special training. Furthermore, you’re trying to create user engagement with something as banal as a test – something that’s literally as boring as watching paint dry. I’ve even gone so far as putting a mini-game in the middle of a long test sequence to keep operators attentive. The mini-game was of course directly relevant to testing certain hardware sensors, but it was surprisingly effective because the operators would race each other on the mini-game to see who could finish the fastest, boosting throughput and increasing worker happiness.

At the end of the day, factories are powered by humans, and it pays to employ a human-first design process when crafting test programs.

Automation

Human operators are prone to error. The more a test can be automated, the more reliable it can be, and in the long run automation will save money. I once visited a large mobile phone maker’s factory, and witnessed a gymnasium-sized room full of test stations replaced by a pair of fully robotic test stations. Instead of hundreds of operators plugging cables in and checking aspects like screen and camera quality, a delicate ballet of robotic actuators would plug connectors into every port in a fraction of a second, and every feature of the phone, from the camera to the GPS, was tested in a couple of minutes. The test stations apparently cost about a million dollars to develop, but the empty cavern of idle test jigs sitting next to them was clear testament to the labor cost savings of such a high degree of automation.

At the smaller scales more typical of startups, automation can happen but it needs to be judiciously applied. Every robotic actuator takes time and money to develop, and they are also prone to wear-out and eventual failure. For the Chibitronics Chibi Chip product, there’s a single mechanical switch on the board, and we developed a simple servo mechanism to actuate the plunger. However, despite using a series-elastic spring and a foam pad to avoid over-stressing the servo motor, over time, we’ve found the motor still fails, and operators have disconnected it in favor of manually pushing the button at the right time.


The Chibi Chip test jig


Detail view of the reset switch servo

Indicator lights can also be tricky to test because the lighting conditions in a factory can be highly variable. Sometimes the floor is flooded by sunlight; other times, it’s lit by dim fluorescent lamps or LED bulbs, each with distinct noise signatures. A simple photodetector will be unreliable unless you can perfectly shield the device under test (DUT) from stray light sources. However, if the product’s LEDs can be modulated (with a PWM waveform, for example), the modulation can be detected through an AC-coupled photodetector. This system tends to be more reliable as the AC coupling rejects sunlight, and the modulation frequency can be chosen to be distinct from other stray light noise sources in the factory.
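
To make that concrete, here is a minimal sketch of the detection side (my own illustration, not code from any particular jig): subtracting the mean mimics the AC coupling, and the Goertzel algorithm measures the remaining energy at the expected modulation frequency.

/// Returns true if `samples` contain significant energy at `target_hz`.
fn led_is_blinking(samples: &[f32], sample_rate_hz: f32, target_hz: f32, threshold: f32) -> bool {
    // Remove the DC component (sunlight, steady ambient light): the
    // digital equivalent of AC coupling.
    let mean = samples.iter().sum::<f32>() / samples.len() as f32;

    // Goertzel recurrence for a single frequency bin.
    let coeff = 2.0 * (2.0 * std::f32::consts::PI * target_hz / sample_rate_hz).cos();
    let (mut s_prev, mut s_prev2) = (0.0f32, 0.0f32);
    for &x in samples {
        let s = (x - mean) + coeff * s_prev - s_prev2;
        s_prev2 = s_prev;
        s_prev = s;
    }
    let power = s_prev * s_prev + s_prev2 * s_prev2 - coeff * s_prev * s_prev2;

    // Normalize by window length so the threshold is independent of how many
    // samples were collected.
    power / samples.len() as f32 > threshold
}

fn main() {
    // Synthesize one second of a 1 kHz blink sampled at 10 kHz, riding on a
    // DC offset, and confirm it is detected.
    let samples: Vec<f32> = (0..10_000)
        .map(|i| 2.0 + (2.0 * std::f32::consts::PI * 1000.0 * i as f32 / 10_000.0).sin())
        .collect();
    println!("blinking: {}", led_is_blinking(&samples, 10_000.0, 1000.0, 1.0));
}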

In general, the gold standard for test automation is to put the DUT into a jig, press a button, wait, and then have a red or green light indicate whether the device passes or fails. For simple products, this should be achievable, but reasonable exceptions can be made by weighing the resources a startup has available to implement tests against the potential frequency and impact of a particular feature escaping the test process. For example, in the case of NeTV2, the functionality of the indicator LEDs and the fan is visually inspected by the operator; in my judgment, all the components involved have generous tolerances and are less likely to be assembled incorrectly, and there are other points downstream of the PCB test during the assembly process where the LEDs and fan operation will be checked yet again, further reducing the likelihood of these features escaping the test process.

Audit and Traceability

Here’s a typical failure scenario at a factory: one operator is running two testers in parallel. The lunch bell rings, and the operator gets up and leaves without noting the status of the test (if you’ve been doing the same thing over and over for the past four hours and running on an empty belly, you’d do the same thing too). After lunch, the operator sits down again, and has to recall whether the units in front of her have been tested or not. As a result of this arbitrary judgment call, sometimes units that didn’t pass test, or weren’t even tested at all, slip into the tested product bins after a shift change.

This is one of the many reasons why it pays to incorporate some sort of audit and traceability program into the tester and product itself. The exact nature of the program will depend greatly upon the exact nature of the product and amount of developer resources available, but a simple example is structuring the test program so that a serial number isn’t generated for the product until all the tests pass – thus, the serial number is a kind of “coupon” to prove the unit has passed test. In the operator-returning-from-lunch scenario, she just has to check for the presence of a serial number to determine the testing state of a particular unit.


The Chibi Chip uses Bitmarks as a coupon to indicate when it has passed test. The Bitmarks also help prevent warranty fraud and deter cloning.
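
A minimal sketch of this serial-number-as-coupon structure (the types and tests here are hypothetical stand-ins, not the actual Chibi Chip code): the serial write is simply unreachable unless every test in the sequence succeeds.

struct Dut {
    serial: Option<u64>,
}

#[derive(Debug)]
struct TestError(&'static str);

fn test_power(_dut: &mut Dut) -> Result<(), TestError> { Ok(()) }
fn test_wifi(_dut: &mut Dut) -> Result<(), TestError> { Ok(()) }

fn run_and_stamp(dut: &mut Dut, next_serial: u64) -> Result<(), TestError> {
    let tests: &[fn(&mut Dut) -> Result<(), TestError>] = &[test_power, test_wifi];
    for t in tests {
        t(dut)?; // abort before stamping if any test fails
    }
    // Only reached when every test has passed: the serial is the coupon.
    dut.serial = Some(next_serial);
    Ok(())
}

fn main() {
    let mut dut = Dut { serial: None };
    match run_and_stamp(&mut dut, 100_001) {
        Ok(()) => println!("passed; serial = {:?}", dut.serial),
        Err(e) => println!("failed ({:?}); no serial assigned", e),
    }
}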

Sometimes I also burn a log of the test into the product itself. It’s important to make the log a circular buffer that can store more than one test run, because products that fail the first time are often retested several times as they are reworked and repaired. This way, if a product is returned by a user, I can query the log and see a fairly complete history of the product’s rework experience in the factory. This is incredibly helpful in debugging factory process issues and holding the factory accountable for marginal practices, such as re-testing a device multiple times without repairing it in the hope of getting lucky and squeezing a “pass” out of the tester due to random environmental fluctuations.
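
A sketch of that log structure (capacity and field layout are illustrative): a fixed set of slots where the oldest record is overwritten once the buffer wraps, so the most recent rework history always survives.

const LOG_SLOTS: usize = 8;

#[derive(Clone, Copy, Default)]
struct TestRecord {
    timestamp: u32, // e.g. seconds since some epoch
    station_id: u8,
    passed: bool,
}

struct TestLog {
    slots: [TestRecord; LOG_SLOTS],
    next: usize, // oldest slot, i.e. where the next write goes
}

impl TestLog {
    fn new() -> TestLog {
        TestLog { slots: [TestRecord::default(); LOG_SLOTS], next: 0 }
    }

    fn append(&mut self, rec: TestRecord) {
        self.slots[self.next] = rec;
        self.next = (self.next + 1) % LOG_SLOTS;
    }
}

fn main() {
    let mut log = TestLog::new();
    // Ten runs into eight slots: runs 0 and 1 get overwritten.
    for run in 0..10u32 {
        log.append(TestRecord { timestamp: run, station_id: 1, passed: run == 9 });
    }
    for rec in &log.slots {
        println!("run {} passed={}", rec.timestamp, rec.passed);
    }
}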

Ideally, these logs are sent up to the cloud or a server directly, but that will depend heavily upon the reliability of the Internet connectivity at your facility. Internet is notoriously unreliable in China, especially to servers not located on the mainland, and so sometimes a small startup with limited resources has to make compromises about the extent and nature of audit and traceability achievable on the factory floor.

Updates

Consumer electronic products are increasingly just software wrapped in a plastic shell. While the hardware itself must stabilize months before production, the software in a product continues to evolve, especially in Internet-connected products that support over-the-air updates. Sometimes patches to a product’s firmware can profoundly alter low-level APIs, breaking the factory test program. For example, I had a product once where the audio drivers went through a major upgrade, going from OSS to ALSA. This changed the way the microphone subsystem was accessed, causing the microphone test to fail in production. Thus user firmware updates can also necessitate a tester program update.

If a test jig is engineered as a stand-alone box that requires logging into a terminal to upgrade, then every time the software team pushes an update, guess what – you’re hopping on a plane to the factory to log in to the test jig and upgrade it. This is not a sustainable upgrade plan for products with complex, constantly evolving internal firmware; thus, as the test jig designer, you’re well-advised to build a secure remote upgrade process into the test jig itself.


That’s me about 12 years ago on a factory floor at 2AM debugging a testjig update gone wrong, bringing production to a screeching halt. Don’t be like me; you can do better!

In addition to a remote upgrade mechanism, you’re going to need a way to validate a test jig update without having to bring down a production line. To help with this, I always keep a physical copy of the production test jig in my office, so I can validate testjig updates before pushing them to the production floor. I try my best to keep the local jig an exact copy of what’s on the line; this may involve taking snapshots of the firmware image, swapping out OS drives between development and production versions, or deliberately breaking features that have somehow failed on the production jigs. This process is inspired by the engineers at JPL and NASA, who keep exact copies of the Mars rovers on Earth so they can thoroughly test an update before pushing it to the rover on Mars. While this discipline can be inconvenient and incurs the cost of an extra test jig, it’s inevitably cheaper than having to book a last-minute flight to your factory to fix things because of an update gone wrong.

As for the upgrade mechanism itself, how fancy and secure you want to get has virtually no limit; I’ve done everything from manual swaps of USB thumb drives that contain the tester configuration data to a private VPN via a dedicated 3G-to-wifi gateway deployed at the factory site. The nature of the product (e.g. does it contain security keys, how often is the product firmware updated) and the funding level of your organization will heavily influence the architecture of the upgrade process.

Responsibility

Given how much effort it takes to build a good test jig, it’s tempting to free up precious developer resources by simply outsourcing the test jig to a third party. I’ve almost never found this to be a good idea. First of all, nobody but the developer knows what skeletons are hidden in a product’s closet. There’s what’s written in the spec, but then there is how faithfully the spec was implemented. Of course, in an ideal world, all specs were perfectly met, but only the developer has a true sense of how spot-on the implementation ended up. This drives the second point, which is avoiding the blame game. By throwing tests over the fence to a third party, if a test isn’t easy to implement or is generating false results, it’s easy to get into a finger-pointing exercise over who is at fault: the developer for not meeting the specs, or the test developer for not being creative enough to implement the test without necessitating design changes.

However, when the developer knows they are ultimately on the hook for the test jig, from day one the developer thinks about design for test. Where will the test points go? How do we make internal state easily visible? What bring-up sequence gives us the most test coverage in the shortest amount of time? By making the developer responsible for the test jig, the test program comes together as the product matures. Bring-up scripts used to validate the product are quickly converted to factory tests, and overall the product achieves a higher standard of testability while saving the money and resources that would otherwise be spent trying to coordinate between two parties with conflicting self-interests.

Code Structure

It’s tempting to think about a test jig as a pile of write-once code that doesn’t need to be maintainable. For simple products, one can definitely get away with this mentality. However, I’ve been bitten more than once by fragile code bases inside production testers. The most typical scenario where things break is when I have to change the order of tests, in order to prioritize testing problematic features first. It doesn’t make sense to test a dozen high-yielding features before running a test on a feature with a known yield issue. That just wastes operator time, and runs up the cost of production.

It’s also hard to predict before production what the most frequent mode of failure will be – after all, any failures you could have anticipated would already have been designed out! So, quite often in the middle of an early production run, I’m challenged with reordering a complex sequence of tests to optimize operator time and improve production throughput.

Tests almost always have dependencies – you have to power on the board before you can flash the firmware; you need firmware before you can connect to wifi; you need credentials to connect to wifi; you have to clean up the test credentials before shipping the product. However, if the process that cleans up the test credentials is also responsible for cleaning up any other temporary tester files (for example, a flag that also sets Bluetooth into test mode), moving the wifi test sequence earlier could result in tester configuration files being left on the customer image, potentially leading to unexpected behaviors (such as Bluetooth still being in test mode in the shipping product!).

Thus, it’s helpful to have some infrastructure for tests that keeps each test modular while enforcing dependencies. Although one could write this code from scratch every single time, we encounter this problem so regularly that Sean ‘Xobs’ Cross set out to create a testjig management system to solve it “once and for all”. The result is a project he calls Exclave, the idea being that Exclave – like an actual geographical exclave – is a tiny bit of territory that you can retain control of inside a foreign factory.
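
To make the dependency idea concrete, here is a toy sketch (illustrative test names, no cycle detection, and emphatically not Exclave’s implementation): each test declares its prerequisites, and the runner derives a safe order, so reordering a scenario can never silently skip a setup or cleanup step.

use std::collections::HashMap;

fn visit<'a>(name: &'a str, deps: &HashMap<&'a str, Vec<&'a str>>, done: &mut Vec<&'a str>) {
    if done.contains(&name) {
        return; // already scheduled
    }
    if let Some(prereqs) = deps.get(name) {
        for d in prereqs {
            visit(*d, deps, done); // schedule prerequisites first
        }
    }
    done.push(name);
}

fn topo_order<'a>(deps: &HashMap<&'a str, Vec<&'a str>>) -> Vec<&'a str> {
    let mut names: Vec<_> = deps.keys().collect();
    names.sort(); // deterministic output despite HashMap iteration order
    let mut done = Vec::new();
    for name in names {
        visit(*name, deps, &mut done);
    }
    done
}

fn main() {
    let mut deps: HashMap<&str, Vec<&str>> = HashMap::new();
    deps.insert("power-on", vec![]);
    deps.insert("flash-firmware", vec!["power-on"]);
    deps.insert("load-credentials", vec!["flash-firmware"]);
    deps.insert("wifi-test", vec!["load-credentials"]);
    deps.insert("cleanup-credentials", vec!["wifi-test"]);
    println!("{:?}", topo_order(&deps));
}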

Introducing Exclave

Exclave is a scaffold designed to give structure to an otherwise amorphous blob of test code, while minimizing the amount of overhead required of the product designer to achieve this structure. The basic features of Exclave are as follows:

  • Code Re-use. During product bring-up, designers write simple scripts to validate each feature individually. Exclave attempts to re-use these scripts by making no assumption about the language used to write them. Python, C, Bash, Node.js, Rust – all are welcome, so long as they run on a command line and can return an exit code.
  • Automated dependency resolution. Each test routine is associated with a “.test” descriptor that describes the dependencies and timeout for a given script; Exclave then resolves the dependencies automatically.
  • Scenario management. Test descriptors are strung together into scenarios, which can be selected dynamically based on the real-time requirements of the factory.
  • Triggers. Typically a test is started by pressing a button, but Exclave’s flexible triggering system also allows tests to start on other cues, such as hot-plug events.
  • Multiple UI targets. Test jig UI can range from a red/green light to a serial console device to a full graphical interface running on a monitor. Exclave has a system for interpreting test results and driving multiple UI sinks. This allows for fast product debugging by attaching a GUI (via an HDMI monitor or laptop) while maintaining compatibility with cost-efficient LED indicators favored for production scale-up.


Above: Exclave helps migrate lab-bench validation code to production-grade factory tests.

To get a little flavor of what Exclave looks like in practice, let’s look at a couple of the tests implemented in the NeTV2 production test flow. First, the production test is split into two repositories: the test descriptors, and the graphical UI. Note that by housing all the tests on GitHub, we also solve the tester upgrade problem by providing the factory with a set of git repo management scripts mapped to double-clickable desktop icons.

These repositories are installed on a Raspberry Pi contained within the test jig, and Exclave is started on boot as a systemd service. The service runs a simple script that fires up Exclave in a target directory which contains a “.jig” file. The “netv2.jig” file specifies the default scenario, among other things.

Here’s an example of what a quick test scenario looks like:
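
In rough outline (the unit-file-style syntax and key names below are my guesses, not verbatim Exclave syntax), a scenario amounts to a named, ordered list of test descriptors:

[Scenario]
Name=NeTV2 Quick Test
Description=Abbreviated factory test flow for the NeTV2
Tests=dut-on, voltage, fpga-id, bitstream, repl, ram, dut-off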

This scenario runs a variety of scripts in different languages that turn on the device (bash/C), check voltages (C), check the ID code of the FPGA (bash/openOCD), load a test bitstream (bash/openOCD), check that the REPL shell can start on the FPGA (Expect/TCL), and then run a RAM test (Expect/TCL) before shutting the board down (bash/C). Many of these scripts were copied directly from code used during board bring-up and system validation.

A basic operation that’s surprisingly tricky to do right is checking for terminal interaction (REPL shell) via serial port. Writing a C or bash script that does this correctly and gracefully handles all error cases is hard, but fortunately someone already solved this problem with the “Expect” TCL extension. Here’s what the REPL shell test descriptor looks like in Exclave:
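
In rough outline (again with guessed, unit-file-style key names rather than verbatim Exclave syntax), the descriptor bundles the dependencies, the timeout, and the path to the Expect script:

[Test]
Name=repl
Description=Verify the REPL shell starts on the FPGA
Requires=bitstream, dut-on
Timeout=30
ExecStart=/usr/bin/expect tests/repl.expect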

As you can see, this points to a couple of other tests as dependencies, sets a time-out, and designates the location of the Expect script.

The Expect script itself is a bit more specialized to the NeTV2, but basically, it looks for the NeTV2 tester firmware shell prompt, which is “TESTER_NX8D>”; the system attempts to recover this prompt by sending a carriage-return sequence once every two seconds and searching for this special string in return. If it receives the string “BIOS” instead, the NeTV2 failed to boot and escaped into the ROM BIOS, probably due to a RAM error; at that point, the Expect script prints a bunch of JSON, which is automatically passed up to the UI layer by Exclave to create a human-readable error message.

Which brings us to the interface layer. The NeTV2 jig has two options for UI: a set of LEDs, or an HDMI monitor. In an ideal world, the only information an operator needs about a board is whether it passed or failed – a green or red LED. Multiple instances of the test jig are needed when a product enters high-volume production (thousands of units per day), so the cost of each test jig becomes a factor during production scale-up. LEDs cost orders of magnitude less than an HDMI monitor – indeed, the rest of the test jig will often cost less than the monitor alone – so using LEDs instead of an HDMI monitor for UI can dramatically slash the cost to scale up production. On the other hand, a pair of LEDs does not give enough information to diagnose what’s gone wrong with a bad board. In a volume production scenario, one would typically collect the (hopefully small) fraction of failed boards and bring them to a secondary station where a more skilled technician debugs them. Exclave allows the same jig used in production to be placed at the debug station, but with an HDMI monitor attached to provide valuable detailed error reports.

With Exclave, both UIs are integrated seamlessly using “.interface” files; for example, one .interface file starts up the http daemon that enables JSON debugging via an HDMI monitor.

In a nutshell, Exclave contains an event reporting system, which logs events in a fashion similar to Linux kernel messages. Events are tagged with metadata, such as severity, and are broadcast to interface handlers that further refine them for the respective UI element. In the case of the LEDs, the handler just listens for “START” [a scenario], “FAIL” [a test], and “FINISH” [a scenario] events, and ignores everything else. In the case of the HDMI interface, a browser configured to run in kiosk mode is pointed at the correct localhost webpage, and a jquery-based HTML document handles the dynamic generation of the UI based upon detailed messages from Exclave.
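
A minimal sketch of that event-to-sink pattern (illustrative types, not Exclave’s actual internals): events carry their metadata, and each UI sink pattern-matches only the events it cares about, ignoring the rest.

#[derive(Debug, Clone)]
enum Event {
    ScenarioStart(String),
    TestFail { test: String, message: String },
    ScenarioFinish { passed: bool },
    Debug(String),
}

trait Ui {
    fn handle(&mut self, ev: &Event);
}

/// The LED sink listens only for start/fail/finish; debug chatter is ignored.
struct LedUi {
    red: bool,
    green: bool,
}

impl Ui for LedUi {
    fn handle(&mut self, ev: &Event) {
        match ev {
            Event::ScenarioStart(_) => { self.red = false; self.green = false; }
            Event::TestFail { .. } => { self.red = true; }
            Event::ScenarioFinish { passed } => { self.green = *passed && !self.red; }
            _ => {} // a richer sink (e.g. the HDMI UI) would render these
        }
    }
}

fn broadcast(sinks: &mut [&mut dyn Ui], ev: &Event) {
    for s in sinks.iter_mut() {
        s.handle(ev);
    }
}

fn main() {
    let mut led = LedUi { red: false, green: false };
    let events = [
        Event::ScenarioStart("quick-test".into()),
        Event::Debug("voltage rail A: 3.31V".into()),
        Event::TestFail { test: "ram".into(), message: "bit error at 0x1000".into() },
        Event::ScenarioFinish { passed: false },
    ];
    let mut sinks: [&mut dyn Ui; 1] = [&mut led];
    for ev in &events {
        broadcast(&mut sinks, ev);
    }
    println!("red={} green={}", led.red, led.green);
}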

The UI is deliberately brutalist in design, using color to highlight only the most important messages, and also includes audible alerts so that operators can zone out while the test runs.

As you can see, the NeTV2 production tester tests everything – from the LEDs to the Ethernet, to features that perhaps few people will ever use, such as the SD card slot and every single GPIO pin. Thanks to Exclave, I was able to get this complex set of tests up and running in under a month: the first code commit was made on Oct 13, 2018, and by Nov 7, I was largely just tweaking tests for performance, and to reflect operational realities discovered on the factory floor.

Also, for the hardware-curious, I did design a custom “hat” for the Raspberry Pi to add several ADC channels and various connectors to facilitate testing. You can check out the source for the tester hat at the Alphamax github repo. I had six of these boards built; five of them have found their way into various parts of the NeTV2 production flow, and if I still have one spare after production is stabilized, I’m planning on installing a replica of a tester at HAX in Shenzhen. That way, those curious to find out more about Exclave can walk up to the tester, log into it, and poke around (assuming HAX agrees to this).

Let’s Stop Re-Inventing the Test Jig!

The unspoken secret of hardware is that behind every product, there’s a robust test jig making sure that every unit shipped to end customers meets quality standards. Hardware startups that don’t anticipate the importance and difficulty of creating such a tester often encounter acute (and sometimes fatal) growing pains. Anytime I build more than a few copies of a piece of hardware, I know I’m going to need a test jig – even for bespoke, short-run products like a conference badge.

After spending months of agony re-inventing the wheel every time we shipped a product, Xobs decided to create Exclave. It’s still a work in progress, but by now it’s been used as the production test infrastructure for several volume products, including the Chibi Chip, Chibi Scope, Tomu, The Phage Blinky Badge, and now NeTV2 (those are all links to the actual Exclave test scripts for each of the respective products — open source ftw!). I feel Exclave has come along far enough that it’s time to invite more users to join the Exclave community and give it a try. The code is located on GitHub and is 100% open source, written in Rust entirely by Xobs. It’s my hope that Exclave can mature into a tool and a community that will save countless Makers and small hardware startups the teething pains of re-inventing the test jig.


Production-proven testjigs that run Exclave. Clockwise from top-right: NeTV2, Chibi Chip, Chibi Scope, Tomu, and The Phage Blinky Badge. The badge tester has even survived a couple of weeks exposed to the harsh elements of the desert as a DIY firmware updating station!


Falling in love with Rust

Let me preface this with an apology: this is a technology love story, and as such, it’s long, rambling, sentimental and personal. Also befitting a love story, it has a When Harry Met Sally feel to it, in that its origins are inauspicious…

First encounters

Over a decade ago, I worked on a technology to which a competitor paid the highest possible compliment: they tried to implement their own knockoff. Because this was done in the open (and because it is uniquely mesmerizing to watch one’s own work mimicked), I spent way too much time following their mailing list and tracking their progress (and yes, taking an especially shameful delight in their occasional feuds). On their team, there was one technologist who was clearly exceptionally capable — and I confess to being relieved when he chose to leave the team relatively early in the project’s life. This was all in 2005; for years afterwards, Rust was, for me, “that thing that Graydon disappeared to go work on.” From the description as I read it at the time, Graydon’s new project seemed outrageously ambitious — and I assumed that little would ever come of it, though certainly not for lack of ability or effort…

Fast forward eight years to 2013 or so. Impressively, Graydon’s Rust was not only still alive, but it had gathered a community and was getting quite a bit of attention — enough to merit a serious look. There seemed to be some very intriguing ideas, but any budding interest that I might have had frankly withered when I learned that Rust had adopted the M:N threading model — including its more baroque consequences like segmented stacks. In my experience, every system that has adopted the M:N model has lived to regret it — and it was unfortunate to have a promising new system appear to be ignorant of the scarred shoulders that it could otherwise stand upon. For me, the implications were larger than this single decision: I was concerned that it may be indicative of a deeper malaise that would make Rust a poor fit for the infrastructure software that I like to write. So while impressed that Rust’s ambitious vision was coming to any sort of fruition at all, I decided that Rust wasn’t for me personally — and I didn’t think much more about it…

Some time later, a truly amazing thing happened: Rust ripped it out. Rust’s reasoning for removing segmented stacks is a concise but thorough damnation; their rationale for removing M:N is clear-eyed, thoughtful and reflective — but also unequivocal in its resolve. Suddenly, Rust became very interesting: all systems make mistakes, but few muster the courage to rectify them; on that basis alone, Rust became a project worthy of close attention.

So several years later, in 2015, it was with great interest that I learned that Adam started experimenting with Rust. On first read of Adam’s blog entry, I assumed he would end what appeared to be excruciating pain by deleting the Rust compiler from his computer (if not by moving to a commune in Vermont) — but Adam surprised me when he ended up being very positive about Rust, despite his rough experiences. In particular, Adam hailed the important new ideas like the ownership model — and explicitly hoped that his experience would serve as a warning to others to approach the language in a different way.

In the years since, Rust continued to mature and my curiosity (and I daresay, that of many software engineers) has steadily intensified: the more I have discovered, the more intrigued I have become. This interest has coincided with my personal quest to find a programming language for the back half of my career: as I mentioned in my Node Summit 2017 talk on platform as a reflection of values, I have been searching for a language that reflects my personal engineering values around robustness and performance. These values reflect a deeper sense within me: that software can be permanent — that software’s unique duality as both information and machine afford a timeless perfection and utility that stand apart from other human endeavor. In this regard, I have believed (and continue to believe) that we are living in a Golden Age of software, one that will produce artifacts that will endure for generations. Of course, it can be hard to hold such heady thoughts when we seem to be up to our armpits in vendored flotsam, flooded by sloppy abstractions hastily implemented. Among current languages, only Rust seems to share this aspiration for permanence, with a perspective that is decidedly larger than itself.

Taking the plunge

So I have been actively looking for an opportunity to dive into Rust in earnest, and earlier this year, one presented itself: for a while, I have been working on a new mechanism for system visualization that I dubbed statemaps. The software for rendering statemaps needs to inhale a data stream, coalesce it down to a reasonable size, and render it as a dynamic image that can be manipulated by the user. This originally started off as being written in node.js, but performance became a problem (especially for larger data sets) and I did what we at Joyent have done in such situations: I rewrote the hot loop in C, and then dropped that into a node.js add-on (allowing the SVG-rendering code to remain in JavaScript). This was fine, but painful: the C was straightforward, but the glue code to bridge into node.js was every bit as capricious, tedious, and error-prone as it has always been. Given the performance constraint, the desire for the power of a higher level language, and the experimental nature of the software, statemaps made for an excellent candidate to reimplement in Rust; my intensifying curiosity could finally be sated!

As I set out, I had the advantage of having watched (if from afar) many others have their first encounters with Rust. And if those years of being a Rust looky-loo taught me anything, it’s that the early days can be like the first days of snowboarding or windsurfing: lots of painful falling down! So I took a deliberate approach with Rust: rather than do what one is wont to do when learning a new language and tinker a program into existence, I really sat down to learn Rust. This is frankly my bias anyway (I always look for the first principles of a creation, as explained by its creators), but with Rust, I went further: not only did I buy the canonical reference (The Rust Programming Language by Steve Klabnik, Carol Nichols and community contributors), I also bought an O’Reilly book with a bit more narrative (Programming Rust by Jim Blandy and Jason Orendorff). And with this latter book, I did something that I haven’t done since cribbing BASIC programs from Enter magazine back in the day: I typed in the example program in the introductory chapters. I found this to be very valuable: it got the fingers and the brain warmed up while still absorbing Rust’s new ideas — and debugging my inevitable transcription errors allowed me to get some understanding of what it was that I was typing. At the end was something that actually did something, and (importantly), by working with a program that was already correct, I was able to painlessly feel some of the tremendous promise of Rust.

Encouraged by these early (if gentle) experiences, I dove into my statemap rewrite. It took a little while (and yes, I had some altercations with the borrow checker!), but I’m almost shocked about how happy I am with the rewrite of statemaps in Rust. Because I know that many are in the shoes I occupied just a short while ago (namely, intensely wondering about Rust, but also wary of its learning curve — and concerned about the investment of time and energy that climbing it will necessitate), I would like to expand on some of the things that I love about Rust other than the ownership model. This isn’t because I don’t love the ownership model (I absolutely do) or that the ownership model isn’t core to Rust (it is rightfully thought of as Rust’s epicenter), but because I think its sheer magnitude sometimes dwarfs other attributes of Rust — attributes that I find very compelling! In a way, I am writing this for my past self — because if I have one regret about Rust, it’s that I didn’t see beyond the ownership model to learn it earlier.

I will discuss these attributes in roughly the order I discovered them with the (obvious?) caveat that this shouldn’t be considered authoritative; I’m still very much new to Rust, and my apologies in advance for any technical details that I get wrong!

1. Rust’s error handling is beautiful

The first thing that really struck me about Rust was its beautiful error handling — but to appreciate why it so resonated with me requires some additional context. Despite its obvious importance, error handling is something we haven’t really gotten right in systems software. For example, as Dave Pacheco observed with respect to node.js, we often conflate different kinds of errors — namely, programmatic errors (i.e., my program is broken because of a logic error) with operational errors (i.e., an error condition external to my program has occurred and it affects my operation). In C, this conflation is unusual, but you see it with the infamous SIGSEGV signal handler that has been known to sneak into more than one undergraduate project moments before a deadline to deal with an otherwise undebuggable condition. In the Java world, this is slightly more common with the (frowned upon) behavior of catching java.lang.NullPointerException or otherwise trying to drive on in light of clearly broken logic. And in the JavaScript world, this conflation is commonplace — and underlies one of the most serious objections to promises.

Beyond the ontological confusion, error handling suffers from an infamous mechanical problem: for a function that may return a value but may also fail, how is the caller to delineate the two conditions? (This is known as the semipredicate problem after a Lisp construct that suffers from it.) C handles this as it handles so many things: by leaving it to the programmer to figure out their own (bad) convention. Some use sentinel values (e.g., Linux system calls cleave the return space in two and use negative values to denote the error condition); some return defined values on success and failure and then set an orthogonal error code; and of course, some just silently eat errors entirely (or even worse).

C++ and Java (and many other languages before them) tried to solve this with the notion of exceptions. I do not like exceptions: for reasons not dissimilar to Dijkstra’s in his famous admonition against “goto”, I consider exceptions harmful. While they are perhaps convenient from a function signature perspective, exceptions allow errors to wait in ambush, deep in the tall grass of implicit dependencies. When the error strikes, higher-level software may well not know what hit it, let alone from whom — and suddenly an operational error has become a programmatic one. (Java tries to mitigate this sneak attack with checked exceptions, but while well-intentioned, they have serious flaws in practice.) In this regard, exceptions are a concrete example of trading the speed of developing software with its long-term operability. One of our deepest, most fundamental problems as a craft is that we have enshrined “velocity” above all else, willfully blinding ourselves to the long-term consequences of gimcrack software. Exceptions optimize for the developer by allowing them to pretend that errors are someone else’s problem — or perhaps that they just won’t happen at all.

Fortunately, exceptions aren’t the only way to solve this, and other languages take other approaches. Closure-heavy languages like JavaScript afford environments like node.js the luxury of passing an error as an argument — but this argument can be ignored or otherwise abused (and it’s untyped regardless), making this solution far from perfect. And Go uses its support for multiple return values to (by convention) return both a result and an error value. While this approach is certainly an improvement over C, it is also noisy, repetitive and error-prone.

By contrast, Rust takes an approach that is unique among systems-oriented languages: leveraging first algebraic data types — whereby a thing can be exactly one of an enumerated list of types and the programmer is required to be explicit about its type to manipulate it — and then combining it with its support for parameterized types. Together, this allows functions to return one thing that’s one of two types: one type that denotes success and one that denotes failure. The caller can then pattern match on the type of what has been returned: if it’s of the success type, it can get at the underlying thing (by unwrapping it), and if it’s of the error type, it can get at the underlying error and either handle it, propagate it, or improve upon it (by adding additional context) and propagating it. What it cannot do (or at least, cannot do implicitly) is simply ignore it: it has to deal with it explicitly, one way or the other. (For all of the details, see Recoverable Errors with Result.)

To make this concrete, in Rust you end up with code that looks like this:

fn do_it(filename: &str) -> Result<(), io::Error> {
    let stat = match fs::metadata(filename) {
        Ok(result) => { result },
        Err(err) => { return Err(err); }
    };

    let file = match File::open(filename) {
        Ok(result) => { result },
        Err(err) => { return Err(err); }
    };

    /* ... */

    Ok(())
}

Already, this is pretty good: it’s cleaner and more robust than multiple return values, return sentinels and exceptions — in part because the type system helps you get this correct. But it’s also verbose, so Rust takes it one step further by introducing the propagation operator: if your function returns a Result, when you call a function that itself returns a Result, you can append a question mark on the call to the function denoting that upon Ok, the result should be unwrapped and the expression becomes the unwrapped thing — and upon Err the error should be returned (and therefore propagated). This is easier seen than explained! Using the propagation operator turns our above example into this:

fn do_it_better(filename: &str) -> Result<(), io::Error> {
    let stat = fs::metadata(filename)?;
    let file = File::open(filename)?;

    /* ... */

    Ok(())
}

This, to me, is beautiful: it is robust; it is readable; it is not magic. And it is safe in that the compiler helps us arrive at this and then prevents us from straying from it.

Platforms reflect their values, and I daresay the propagation operator is an embodiment of Rust’s: balancing elegance and expressiveness with robustness and performance. This balance is reflected in a mantra that one hears frequently in the Rust community: “we can have nice things.” Which is to say: while historically some of these values were in tension (i.e., making software more expressive might implicitly be making it less robust or more poorly performing), through innovation Rust is finding solutions that don’t compromise one of these values for the sake of the other.

2. The macros are incredible

When I was first learning C, I was (rightly) warned against using the C preprocessor. But like many of the things that we are cautioned about in our youth, this warning was one that the wise give to the enthusiastic to prevent injury; the truth is far more subtle. And indeed, as I came of age as a C programmer, I not only came to use the preprocessor, but to rely upon it. Yes, it needed to be used carefully — but in the right hands it could generate cleaner, better code. (Indeed, the preprocessor is very core to the way we implement DTrace’s statically defined tracing.) So if anything, my problems with the preprocessor were not its dangers so much as its many limitations: because it is, in fact, a preprocessor and not built into the language, there were all sorts of things that it would never be able to do — like access the abstract syntax tree.

With Rust, I have been delighted by its support for hygienic macros. This not only solves the many safety problems with preprocessor-based macros, it allows them to be outrageously powerful: with access to the AST, macros are afforded an almost limitless expansion of the syntax — but invoked with an indicator (a trailing bang) that makes it clear to the programmer when they are using a macro. For example, one of the fully worked examples in Programming Rust is a json! macro that allows for JSON to be easily declared in Rust. This gets to the ergonomics of Rust, and there are many macros (e.g., format!, vec!, etc.) that make Rust more pleasant to use.

Another advantage of macros: they are so flexible and powerful that they allow for effective experimentation. For example, the propagation operator that I love so much actually started life as a try! macro; that this macro was being used ubiquitously (and successfully) allowed a language-based solution to be considered. Languages can be (and have been!) ruined by too much experimentation happening in the language rather than in how it’s used; through its rich macros, it seems that Rust can enable the core of the language to remain smaller — and to make sure that when it expands, it is for the right reasons and in the right way.

3. format! is a pleasure

Okay, this is a small one but it’s (another) one of those little pleasantries that has made Rust really enjoyable. Many (most? all?) languages have an approximation or equivalent of the venerable sprintf, whereby variable input is formatted according to a format string. Rust’s variant of this is the format! macro (which is in turn invoked by println!, panic!, etc.), and (in keeping with one of the broader themes of Rust) it feels like it has learned from much that came before it. It is type-safe (of course) but it is also clean in that the {} format specifier can be used on any type that implements the Display trait. I also love that the {:?} format specifier denotes that the argument’s Debug trait implementation should be invoked to print debug output. More generally, all of the format specifiers map to particular traits, allowing for an elegant approach to an historically grotty problem. There are a bunch of other niceties, and it’s all a concrete example of how Rust uses macros to deliver nice things without sullying syntax or otherwise special-casing. None of the formatting capabilities are unique to Rust, but that’s the point: in this (small) domain (as in many) Rust feels like a distillation of the best work that came before it. As anyone who has had to endure one of my talks can attest, I believe that appreciating history is essential both to understand our present and to map our future. Rust seems to have that perspective in the best ways: it is reverential of the past without being incarcerated by it.
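
To make that concrete, here is a tiny self-contained illustration (my own example) of the two traits at work:

use std::fmt;

#[derive(Debug)]
struct Point {
    x: i32,
    y: i32,
}

// {} requires Display, which we implement by hand...
impl fmt::Display for Point {
    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
        write!(f, "({}, {})", self.x, self.y)
    }
}

fn main() {
    let p = Point { x: 3, y: 4 };
    println!("{}", p);   // Display: prints "(3, 4)"
    println!("{:?}", p); // Debug (derived): prints "Point { x: 3, y: 4 }"
}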

4. include_str! is a godsend

One of the filthy aspects of the statemap code is that it effectively encapsulates another program — a JavaScript program that lives in the SVG to allow for the interactivity of the statemap. This code lives in its own file, which the statemap code should pass through to the generated SVG. In the node.js/C hybrid, I am forced to locate the file in the filesystem — which is annoying because it has to be delivered along with the binary and located, etc. Now Rust — like many languages (including ES6) — has support for raw-string literals. As an aside, it’s interesting to see the discussion leading up to its addition, and in particular, how a group of people really looked at every language that does this to see what should be mimicked versus what could be improved upon. I really like the syntax that Rust converged on: r followed by one or more octothorpes followed by a quote to begin a raw string literal, and a quote followed by a matching number of octothorpes to end it, e.g.:

    let str = r##""What a curious feeling!" said Alice"##;

This alone would have allowed me to do what I want, but it would still be a tad gross in that it’s a bunch of JavaScript living inside a raw literal in a .rs file. Enter include_str!, which allows me to tell the compiler to find the specified file in the filesystem during compilation, and statically drop it into a string variable that I can manipulate:

        ...
        /*
         * Now drop in our in-SVG code.
         */
        let lib = include_str!("statemap-svg.js");
        ...

So nice! Over the years I have wanted this many times over for my C, and it’s another one of those little (but significant!) things that make Rust so refreshing.

5. Serde is stunningly good

Serde is a Rust crate that allows for serialization and deserialization, and it’s just exceptionally good. It uses macros (and, in particular, Rust’s procedural macros) to generate structure-specific routines for serialization and deserialization. As a result, Serde requires remarkably little programmer lift to use and performs eye-wateringly well — a concrete embodiment of Rust’s repeated defiance of the conventional wisdom that programmers must choose between abstractions and performance!

For example, in the statemap implementation, the input is concatenated JSON that begins with a metadata payload. To read this payload in Rust, I define the structure, and denote that I wish to derive the Deserialize trait as implemented by Serde:

#[derive(Deserialize, Debug)]
#[allow(non_snake_case)]
struct StatemapInputMetadata {
    start: Vec<u64>,
    title: String,
    host: Option<String>,
    entityKind: Option<String>,
    states: HashMap<String, StatemapInputState>,
}

Then, to actually parse it:

     let metadata: StatemapInputMetadata = serde_json::from_str(payload)?;

That’s… it. Thanks to the magic of the propagation operator, the errors are properly handled and propagated — and it has handled tedious, error-prone things for me like the optionality of certain members (itself beautifully expressed via Rust’s ubiquitous Option type). With this one line of code, I now (robustly) have a StatemapInputMetadata instance that I can use and operate upon — and this performs incredibly well on top of it all. In this regard, Serde represents the best of software: it is a sophisticated, intricate implementation making available elegant, robust, high-performing abstractions; as legendary White Sox play-by-play announcer Hawk Harrelson might say, MERCY!

6. I love tuples

In my C, I have been known to declare anonymous structures in functions. More generally, in any strongly typed language, there are plenty of times when you don’t want to have to fill out paperwork to be able to structure your data: you just want a tad more structure for a small job. For this, Rust borrows an age-old construct from ML in tuples. Tuples are expressed as a parenthetical list, and they basically work as you expect them to work in that they are static in size and type, and you can index into any member. For example, in some test code that needs to make sure that names for colors are correctly interpreted, I have this:

        let colors = vec![
            ("aliceblue", (240, 248, 255)),
            ("antiquewhite", (250, 235, 215)),
            ("aqua", (0, 255, 255)),
            ("aquamarine", (127, 255, 212)),
            ("azure", (240, 255, 255)),
            /* ... */
        ];

Then colors[2].0 (say) will be the string “aqua”, and (colors[1].1).2 will be the integer 215. Don’t let the absence of a type declaration in the above deceive you: tuples are strongly typed, it’s just that Rust is inferring the type for me. So if I accidentally try to (say) add an element to the above vector that contains a tuple of mismatched signature (e.g., the tuple “((188, 143, 143), ("rosybrown"))”, which has the order reversed), Rust will give me a compile-time error.

The full integration of tuples makes them a joy to use. For example, if a function returns a tuple, you can easily assign its constituent parts to disjoint variables, e.g.:

fn get_coord() -> (u32, u32) {
   (1, 2)
}

fn do_some_work() {
    let (x, y) = get_coord();
    /* x has the value 1, y has the value 2 */
}

Great stuff!

7. The integrated testing is terrific

One of my regrets on DTrace is that we didn’t start on the DTrace test suite at the same time we started the project. And even after we started building it (too late, but blessedly before we shipped it), it still lived away from the source for several years. And even now, it’s a bit of a pain to run — you really need to know it’s there.

This represents everything that’s wrong with testing in C: because it requires bespoke machinery, too many people don’t bother — even when they know better! Viz.: in the original statemap implementation, there is zero testing code — and not because I don’t believe in it, but just because it was too much work for something relatively small. Yes, there are plenty of testing frameworks for C and C++, but in my experience, the integrated frameworks are too constrictive — and again, not worth it for a smaller project.

With the rise of test-driven development, many languages have taken a more integrated approach to testing. For example, Go has a rightfully lauded testing framework, Python has unittest, etc. Rust takes a highly integrated approach that combines the best of all worlds: test code lives alongside the code that it’s testing — but without having to make the code bend to a heavyweight framework. The workhorses here are conditional compilation and Cargo, which together make it so easy to write tests and run them that I found myself doing true test-driven development with statemaps — namely, writing the tests as I developed the code.
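
As a minimal illustration (a generic example, not from statemaps), the test lives in the same file as the code it exercises, is compiled only when running cargo test, and needs no external framework:

pub fn add(a: u32, b: u32) -> u32 {
    a + b
}

#[cfg(test)]
mod tests {
    use super::*;

    // Run with `cargo test`; this module is compiled out of release builds.
    #[test]
    fn adds_small_numbers() {
        assert_eq!(add(2, 2), 4);
    }
}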

8. The community is amazing

In my experience, the best communities are ones that are inclusive in their membership but resolute in their shared values. When communities aren’t inclusive, they stagnate, or rot (or worse); when communities don’t share values, they feud and fracture. This can be a very tricky balance, especially when so many open source projects start out as the work of a single individual: it’s very hard for a community not to reflect the idiosyncrasies of its founder. This is important because in the open source era, community is critical: one is selecting a community as much as one is selecting a technology, as each informs the future of the other. One factor that I value a bit less is strictly size: some of my favorite communities are small ones — and some of my least favorite are huge.

For purposes of a community, Rust has a luxury of clearly articulated, broadly shared values that are featured prominently and reiterated frequently. If you head to the Rust website this is the first sentence you’ll read:

Rust is a systems programming language that runs blazingly fast, prevents segfaults, and guarantees thread safety.

That gets right to it: it says that as a community, we value performance and robustness — and we believe that we shouldn’t have to choose between these two. (And we have seen that this isn’t mere rhetoric, as so many Rust decisions show that these values are truly the lodestar of the project.)

And with respect to inclusiveness, it is revealing that you will likely read that statement of values in your native tongue, as the Rust web page has been translated into thirteen languages. Just the fact that it has been translated into so many languages makes Rust nearly unique among its peers. But perhaps more interesting is where this globally inclusive view likely finds its roots: among the sites of its peers, only Ruby is similarly localized. Given that several prominent Rustaceans like Steve Klabnik and Carol Nichols came from the Ruby community, it would not be unreasonable to guess that they brought this globally inclusive view with them. This kind of inclusion is one that one sees again and again in the Rust community: different perspectives from different languages and different backgrounds. Those who come to Rust bring with them their experiences — good and bad — from the old country, and the result is a melting pot of ideas. This is an inclusiveness that runs deep: by welcoming such disparate perspectives into a community and then uniting them with shared values and a common purpose, Rust achieves a rich and productive heterogeneity of thought. That is, because the community agrees about the big things (namely, its fundamental values), it has room to constructively disagree (that is, achieve consensus) on the smaller ones.

Which isn’t to say this is easy! Check out Ashley Williams in the opening keynote from RustConf 2018 for how exhausting it can be to hash through these smaller differences in practice. Rust has taken a harder path than the “traditional” BDFL model, but it’s a qualitatively better one — and I believe that many of the things that I love about Rust are a reflection of (and a tribute to) its robust community.

9. The performance rips

Finally, we come to the last thing I discovered in my Rust odyssey — but in many ways, the most important one. As I described in an internal presentation, I had experienced some frustrations trying to implement in Rust the same structure I had had in C. So I mentally gave up on performance, resolving to just get something working first, and then optimize it later.

I did get it working, and was able to benchmark it, but to give some context for the numbers, here is the time to generate a statemap in the old (slow) pure node.js implementation for a modest trace (229M, ~3.9M state transitions) on my 2.9 GHz Core i7 laptop:

% time ./statemap-js/bin/statemap ./pg-zfs.out > js.svg

real	1m23.092s
user	1m21.106s
sys	0m1.871s

This is bad — and larger input will cause it to just run out of memory. And here’s the version reimplemented as a C/node.js hybrid:

% time ./statemap-c/bin/statemap ./pg-zfs.out > c.svg

real	0m11.800s
user	0m11.414s
sys	0m0.330s

This was (as designed) a 10X improvement in performance, and represents speed-of-light numbers in that this seems to be an optimal implementation. Because I had written my Rust naively (and my C carefully), my hope was that the Rust would be no more than 20% slower — but I was braced for pretty much anything. Or at least, I thought I was; I was actually genuinely taken aback by the results:

$ time ./statemap.rs/target/release/statemap ./pg-zfs.out > rs.svg
3943472 records processed, 24999 rectangles

real	0m8.072s
user	0m7.828s
sys	0m0.186s

Yes, you read that correctly: my naive Rust was ~32% faster than my carefully implemented C. This blew me away, and in the time since, I have spent some time on a real lab machine running SmartOS (where I have reproduced these results and been able to study them a bit). My findings are going to have to wait for another blog entry, but suffice it to say that despite executing a shockingly similar number of instructions, the Rust implementation has a different load/store mix (it is much more store-heavy than C) — and is much better behaved with respect to the cache. Given the degree that Rust passes by value, this makes some sense, but much more study is merited.

It’s also worth mentioning that there are some easy wins that will make the Rust implementation even faster: after I had publicized the fact that I had a Rust implementation of statemaps working, I was delighted when David Tolnay, the author of Serde, took the time to make some excellent suggestions for improvement. For a newcomer like me, it’s a great feeling to have someone with such deep expertise as David’s take an interest in helping me make my software perform even better — and it is revealing as to the core values of the community.

Rust’s shockingly good performance — and the community’s desire to make it even better — fundamentally changed my disposition towards it: instead of seeing Rust as a language to augment C and replace dynamic languages, I’m looking at it as a language to replace both C and dynamic languages in all but the very lowest layers of the stack. C — like assembly — will continue to have a very important place for me, but it’s hard to not see that place as getting much smaller relative to the barnstorming performance of Rust!

Beyond the first impressions

I wouldn’t want to imply that this is an exhaustive list of everything that I have fallen in love with about Rust. That list is much longer and would include at least the ownership model; the trait system; Cargo; and the type inference system. And I feel like I have just scratched the surface; I haven’t waded into known strengths of Rust like the FFI and the concurrency model! (Despite having written plenty of multithreaded code in my life, I haven’t so much as created a thread in Rust!)

Building a future

I can say with confidence that my future is in Rust. As I have spent my career doing OS kernel development, a natural question would be: do I intend to rewrite the OS kernel in Rust? In a word, no. To understand my reluctance, take some of my most recent experience: this blog entry was delayed because I needed to debug (and fix) a nasty problem with our implementation of the Linux ABI. As it turns out, Linux and SmartOS make slightly different guarantees with respect to the interaction of vfork and signals, and our code was fatally failing on a condition that should be impossible. Any old Unix hand (or quick study!) will tell you that vfork and signal disposition are each semantic superfund sites in their own right — and that their horrific (and ill-defined) confluence can only be unimaginably toxic. But the real problem is that actual software implicitly depends on these semantics — and any operating system that is going to want to run existing software will itself have to mimic them. You don’t want to write this code, because no one wants to write this code.

Now, one option (which I honor!) is to rewrite the OS from scratch, as if legacy applications essentially didn’t exist. While there is a tremendous amount of good that can come out of this (and it can find many use cases), it’s not a fit for me personally.

So while I may not want to rewrite the OS kernel in Rust, I do think that Rust is an excellent fit for much of the broader system. For example, at the recent OpenZFS Developers Summit, Matt Ahrens and I were noodling the notion of user-level components for ZFS in Rust. Specifically: zdb is badly in need of a rewrite — and Rust would make an excellent candidate for it. There are many such examples spread throughout ZFS and the broader system, including a few in the kernel. Might we want to have a device driver model that allows for Rust drivers? Maybe! (And certainly, it’s technically possible.) In any case, you can count on a lot more Rust from me and into the indefinite future — whether in the OS, near the OS, or above the OS.

Taking your own plunge

I wrote all of this up in part to not only explain why I took the plunge, but to encourage others to take their own. If you were as I was and are contemplating diving into Rust, a couple of pieces of advice, for whatever they’re worth:

  • I would recommend getting both The Rust Programming Language and Programming Rust. They are each excellent in their own right, and different enough to merit owning both. I also found it very valuable to have two different sources on subjects that were particularly thorny.
  • Understand ownership before you start to write code. The more you understand ownership in the abstract, the less you’ll have to learn at the merciless hands of compiler error messages.
  • Get in the habit of running rustc on short programs. Cargo is terrific, but I personally have found it very valuable to write short Rust programs to understand a particular idea — especially when you want to understand optional or new features of the compiler. (Roll on, non-lexical lifetimes!)
  • Be careful about porting something to Rust as a first project — or otherwise implementing something you’ve implemented before. Now, obviously, this is exactly what I did, and it can certainly be incredibly valuable to be able to compare an implementation in Rust to an implementation in another language — but it can also cut against you: the fact that I had implemented statemaps in C sent me down some paths that were right for C but wrong for Rust; I made much better progress when I rethought the implementation of my problem the way Rust wanted me to think about it.
  • Check out the New Rustacean podcast by Chris Krycho. I have really enjoyed Chris’s podcasts, and have been working my way through them when commuting or doing household chores. I particularly enjoyed his interview with Sean Griffin and his interview with Carol Nichols.
  • Check out rustlings. I learned about it a little too late; I wish I had known about it earlier! I did work through the Rust koans, which I enjoyed and would recommend for the first few hours with Rust.
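
To make the ownership point concrete, here is a minimal sketch of the move/borrow distinction — the function names are mine, purely for illustration:

    // print_len borrows the vector; consume takes ownership of it.
    fn print_len(v: &[i32]) {
        println!("len = {}", v.len()); // read access through a shared borrow
    }

    fn consume(v: Vec<i32>) {
        let sum: i32 = v.iter().sum();
        println!("sum = {}", sum); // v is dropped when this function returns
    }

    fn main() {
        let v = vec![1, 2, 3];
        print_len(&v); // a borrow: v remains usable afterwards
        consume(v);    // a move: ownership is transferred into consume
        // print_len(&v); // error[E0382]: borrow of moved value: `v`
    }

Internalizing why that last (commented-out) line can’t compile — the value has moved, so there is nothing left to borrow — will spare you a lot of time staring at E0382.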
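
And in the spirit of running rustc on short programs, here is the sort of ten-line throwaway I mean — this one pokes at non-lexical lifetimes (the file name is just an example):

    // nll.rs: under the old lexical borrow rules, the shared borrow held by
    // `first` lasted to the end of the block, so the push was rejected; with
    // non-lexical lifetimes, the borrow ends at its last use and this compiles.
    fn main() {
        let mut v = vec![1, 2, 3];
        let first = &v[0];             // shared borrow of v...
        println!("first = {}", first); // ...whose last use is here
        v.push(4);                     // so this mutation is now allowed
        println!("v = {:?}", v);
    }

No Cargo scaffolding needed: rustc nll.rs && ./nll. (Seeing the non-lexical behavior requires a new enough toolchain; on a current compiler it simply builds.)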

I’m sure that there’s a bunch of stuff that I missed; if there’s a particular resource that you found useful when learning Rust, message me or leave a comment here and I’ll add it.

Let me close by offering a sincere thanks to those in the Rust community who have been working so long to develop such a terrific piece of software — and especially those who have worked so patiently to explain their work to us newcomers. You should be proud of what you’ve accomplished, both in terms of a revolutionary technology and a welcoming community — thank you for inspiring so many of us about what infrastructure software can become, and I look forward to many years of implementing in Rust!



Gaming Gets More Inclusive With The Launch Of The Xbox Adaptive Controller


Without a doubt, 2018 has been a hallmark year for inclusivity in gaming. From individual platforms and games introducing more features for gamers with accessibility needs to physical hardware like the Xbox Adaptive Controller, there has never been a higher point for inclusivity in gaming. The first-of-its-kind Xbox Adaptive Controller is available starting today at Microsoft Stores and GameStop Online for $99.99, so even more gamers from around the world can engage with their friends and favorite gaming content on Xbox One and Windows 10.

The Xbox Adaptive Controller is available starting today:

Purchase Xbox Adaptive Controller from GameStop

Purchase Xbox Adaptive Controller from Microsoft Store

The Xbox Adaptive Controller is a product that was ideated and pioneered with inclusivity at its heart. We iterated on and refined it through close partnership with gamers with limited mobility and fan feedback, as well as guidance and creativity from accessibility experts, advocates and partners such as The AbleGamers Charity, The Cerebral Palsy Foundation, Craig Hospital, Special Effect and Warfighter Engaged. Even the accessible packaging the Xbox Adaptive Controller arrives in was an entirely new approach to product packaging—directly informed and guided by gamers with limited mobility. It was truly the collaboration and teamwork of these individuals and groups that brought the Xbox Adaptive Controller to gamers around the world. And gaming, everywhere, becomes greater because of that collaborative spirit.


To the gamers and industry professionals around the world who shared their thoughts, feelings and feedback on either the Xbox Adaptive Controller itself or the accessible packaging it ships in—thank you. From gamers like Mike Luckett, a combat veteran based in the US who tested and shared feedback on the controller through the beta program, to gamers in the UK who kindly invited us into their homes and shared which iteration of the accessible packaging they liked most—this day of launch is a thanks to all your contributions. On behalf of gamers everywhere, we share our sincere thanks.

While the response from communities, gamers and press when we introduced the controller in May was remarkable, the true impact the Xbox Adaptive Controller has had on gamers became clearer at events like E3 in Los Angeles in June. Walking the show floor in an “Xbox Adaptive Controller” t-shirt to run a simple errand, you are bombarded with smiles, greetings and high-fives—shared by gamers of all types embracing and furthering support for inclusivity in gaming. It’s a powerful sentiment of appreciation, and we’re humbled by the reception.


Beyond the humbling praise from the gaming industry, the Xbox Adaptive Controller has been equally recognized for its innovative approach to inclusive design. In fact, just today it was announced that the V&A, the world’s leading museum of art, design and performance, has acquired the controller as part of its Rapid Response Collecting program, which collects contemporary objects reflecting major moments in recent history in design, technology and manufacturing. It’s an honor we did not set out to achieve, but we are moved by this recognition of the team’s passionate work on the Xbox Adaptive Controller, which has helped it stand out as a truly first-of-its-kind product—in gaming and beyond.

Let today be a celebration of inclusivity in gaming—regardless of your platform, community or game of choice. Whether you’re a gamer using the Xbox Adaptive Controller for the first time or new to gaming, welcome to the Xbox family! Inclusivity starts with the notion of empowering everyone to have more fun. That means making our products usable by everyone, welcoming everyone, and creating a safe environment for everyone.

If you’re looking for more information on the Xbox Adaptive Controller, the peripherals available today to configure it for your use, or tips on how to get set up, we’ve got you covered. Learn more here about peripherals from our hardware partners such as Logitech, RAM and PDP, used to customize your Xbox Adaptive Controller configuration. Visit this page to learn more about using Copilot with the Xbox Adaptive Controller. And here is some general product information to help you learn more about the Xbox Adaptive Controller. Thanks again for joining us on this incredible journey of inclusivity; see you online!


McCain

TUCSON, AZ – MARCH 26: Senator John McCain and former Alaska state Governor Sarah Palin campaign at Pima County Fairgrounds, March 26, 2010 in Tucson, Arizona. (Photo by Darren Hauck/Getty Images)

John McSame has died.

Any decent obituary of John McCain has to be as much about the media fawning over him as about the man himself. This was not a good man, and yet no one was more lionized by the Beltway media establishment in the entire recent history of American politics; possibly no one since John F. Kennedy has received more fawning coverage, and much of that for JFK was post-1963. Why McCain received this adoration may remain a mystery to historians for years, because it’s completely nonsensical based on the man’s actual career. And yet, the media could never get enough of him. McCain claimed the most all-time Meet the Press appearances over the show’s long run, and it’s hard to imagine that he doesn’t hold the record, although in 2007 NBC said it was Bob Dole—but that was before another decade of weekly McCain appearances.

McCain was born in the Canal Zone in 1936 to a Naval Air officer. A military brat, he moved around constantly with his family and finally went to high school in the DC area. He entered the Naval Academy, as his father and grandfather had done. He was largely terrible, graduating 894th in a class of 899. McCain was a partier and a ladies’ man who took the social life more seriously than his early air training. But he managed to become a competent, if risk-taking, pilot. He got married in 1965 and then went to Vietnam, where he asked for a combat assignment. On his 23rd mission, he was shot down over North Vietnam and nearly died, first from his injuries and landing in water, and then from being beaten after the villagers rescued him. After all, he was raining fire on them and killing them left and right. The reaction of the villagers is entirely understandable.

When the North Vietnamese discovered that McCain’s father was an admiral, he became a showpiece for them and he was treated a little better and received medical care for his wounds, although he certainly received his share of brutality after that too. He was moved from prison to prison, including two years of solitary confinement that began in March 1968. Torture started to break him. He considered suicide but was interrupted while preparing for it. He signed a bogus confession, and was later ashamed, but of course no one can really stand up to torture. He was finally released from prison in 1973, after five hellish years. Whether all this makes him a “hero” or not is a question I guess you will have to decide. I certainly don’t question the man’s toughness or personal bravery. I don’t find the term hero particularly useful and I’m unclear how this qualifies someone for the title as opposed to, I don’t know, organizing people to lift themselves out of poverty, but this is a battle I recognize I will never win. In any case, I don’t think his history should have meant anything when it came to political respectability, but of course it did.

McCain returned to the U.S. and went back to his hard-partying ways, making up for lost time. He had affair after affair, destroying his marriage to the woman who had waited all those years for him, a woman who had suffered through a severe car crash in the meantime. But McCain was now a celebrity and had huge ambition to take advantage of that. He entered the political world in 1977 when he became the Navy’s liaison to the Senate, introducing him to basically everybody. Still married, he began dating Cindy Hensley, the daughter of a very wealthy beer distributor. He pressured his first wife into a divorce, married Cindy, and then used her money to finance his burgeoning political career. How assured was this political career by then? His groomsmen were Gary Hart and William Cohen. He resigned from the Navy in 1981 and prepared to run for office. Cindy’s dad hired McCain into his company and that put him firmly in the Arizona elite, where he got to know such useful people as the financier Charles Keating. Cindy funded his 1982 entrance into electoral politics, when he won election to Congress from Arizona-01.

When McCain entered Congress, he was really nothing more than a bog-standard Republican. For a guy who made his real reputation on foreign policy, his foreign policy stances were terrible. He embraced right-wing dictators in Latin America. He traveled to Chile to meet with Augusto Pinochet. When Reagan illegally funded the Contras to overthrow the Sandinista Revolution, McCain loved it. McCain also showed his bipartisan maverickocity by taking a really brave stand—opposing making Martin Luther King Day a national holiday! Why, I haven’t seen such Republican leadership since Dick Cheney did all he could to help apartheid South Africa! McCain later said he regretted this—in hindsight, I believe him. But it doesn’t really matter what your views are when something becomes so normalized that it is universal. This is like saying one opposes slavery in 2018. Who doesn’t?!? What matters is what you did when it was time to make the decision. And as he would through most, albeit not all, of his career, he failed miserably when the rubber met the road.

Of course, none of this hurt him with right-wing Arizona voters and he won election to the Senate in 1986. He continued with many of his well-defined interests. His noted love of gambling and close ties with the gambling industry led him to sponsor the Indian Gaming Regulatory Act of 1988. He took a seat on the Armed Services Committee, using it to make his love of American militarism his top policy priority. He was a big supporter of Gramm-Rudman, which forced automatic budget cuts when budget deficits occurred—deficits that were likely, given McCain’s love of big things that go boom and cost lots of money. He made a positive impression on the media early on, leading to speculation that George Bush could name him as his VP candidate in 1988. Let’s face it—that would have been a much better choice for Bush than Dan Quayle!

What got John McCain his first major spotlight in the Senate? Being a member of the Keating Five. His Arizona buddy Keating had given McCain well over $100,000 in campaign contributions, had given him free flights, and all the other quasi-legal or slightly illegal perks of political influence. So when the federal government came after Keating for his crimes, he sought to cash in with McCain and the other politicians he had purchased. With that kind of corruption, you can see why the media fawned over him! Basically, McCain, Alan Cranston, John Glenn, Dennis DeConcini, and Donald Riegle intervened to protect the interests of Charles Keating in the Savings and Loan scandal, getting the Federal Home Loan Bank Board to back off its investigation. Keating had contributed $1.3 million to the five senators and this came out when the company collapsed in 1989, defrauding 23,000 bondholders. But hey, McCain actually did work with Democrats on that one! The Phoenix New Times’ Tom Fitzpatrick, in 1989:

You’re John McCain, a fallen hero who wanted to become president so desperately that you sold yourself to Charlie Keating, the wealthy con man who bears such an incredible resemblance to The Joker.

Obviously, Keating thought you could make it to the White House, too.

He poured $112,000 into your political campaigns. He became your friend. He threw fund raisers in your honor. He even made a sweet shopping-center investment deal for your wife, Cindy. Your father-in-law, Jim Hensley, was cut in on the deal, too.

Nothing was too good for you. Why not? Keating saw you as a prime investment that would pay off in the future.

So he flew you and your family around the country in his private jets. Time after time, he put you up for serene, private vacations at his vast, palatial spa in the Bahamas. All of this was so grand. You were protected from what Thomas Hardy refers to as “the madding crowd.” It was almost as though you were already staying at a presidential retreat.

Like the old song, that now seems “Long ago and far away.”

Since Keating’s collapse, you find yourself doing obscene things to save yourself from the Senate Ethics Committee’s investigation. As a matter of course, you engage in backbiting behavior that will turn you into an outcast in the Senate if you do survive.

Ouch.

It’s amazing that all the media lauding of McCain over the past decades totally forgets his corruption! But the people of Arizona didn’t care either and he won reelection in 1992 with 56% of the vote.

Here’s the key thing to know about John McCain—he really loved killing brown people around the world to show other nations how tough the United States is. Little defines him more than that. Every time there was a crisis with another nation, one usually created by American militarism—every time!—he would go on TV and massively exaggerate its importance to demonstrate the need for Americans to show toughness and, of course, bomb people. And sure, giving manly campaign speeches on the crisis in Georgia in 2008 that were lifted from Wikipedia might have shown his utter intellectual vacuity, but he’s so tough and mavericky!

And then there is his class that one can only love. I mean, who but a true hero to journalists would tell the following joke, as McCain did to Republican funders in the late 1990s:

“Do you know why Chelsea Clinton is so ugly?”
“Because Janet Reno is her father.”

Ha ha ha. What mavericktude! Making fun of the looks of both a teenage girl and a pioneering Cabinet official. This is a good summary of that joke:

“The remark packed into its 15 words several layers of misogyny. It disparaged the looks of Chelsea, then 18 and barely out of high school; it portrayed Reno as a man at a time when she was serving as the first female US attorney general; and it implied that Hillary Clinton was engaged in a lesbian affair while the Monica Lewinsky scandal was blazing. Not bad going, Senator McCain.”

God, what a great American! No wonder we laud him as a hero!

Now, look, McCain wasn’t a legendarily bad senator, particularly in comparison with other early twenty-first century Republicans. On some issues, he did good things. He helped normalize relations with Vietnam and was a critical voice on this issue when it was still sensitive since the diehard POW-MIA people wanted to fight the war forever, determined that evil Asian commies were still holding our boys in torturous cells, imagining Christopher Walken in The Deer Hunter as a daily occurrence in the 1980s. And while his wife’s fortune is what propelled him into office, he did recognize that outside campaign funding was a problem in our political system. McCain-Feingold is not my favorite piece of legislation but it moved the ball toward a goal of better democracy and was probably the best bill that could be passed at the time, or since. Of course, at the same time, he was receiving contributions from the same companies he was supposed to be regulating as head of the Senate Commerce Committee. He wanted to regulate the tobacco industry more heavily, which is hardly controversial, but of course was at the time. So, fine. McCain is not Jesse Helms or James Inhofe or Ted Cruz.

McCain also very badly wanted to be president. So he played up to the Republican base on 90 percent of issues. And of course he was horrible on Bill Clinton. I strongly dislike Bill Clinton for many reasons, but the impeachment proceedings were a direct attack on American democracy, sheer political partisanship for short-term gain. Now, a real political maverick would have noted that even though the impeached president was not a member of his own party, the whole thing was a bunch of nonsense that broke the norms McCain gave lip service to respecting. Naturally, McCain did the opposite and voted for conviction. He followed up this red meat to the Republican base with a book, the sure sign that a politician is thinking about running for president. McCain decided to take on George W. Bush for the Republican nomination in 2000.

Now, even though I am not painting a particularly positive picture of McCain, he was still suspect to the Republican elite because of his very occasional actions working with Democrats. The Straight Talk Express was mostly just pandering to a media that already saw McCain as their Republican daddy who would save us from more Democrats where the men act like women and the women act like men, to borrow from American sage Maureen Dowd. And so, Bush and his allies decided to undercut McCain in the dirtiest way they could get away with.

After McCain won New Hampshire, Karl Rove and his ratfuckers went low on McCain, actually accusing him of fathering a black child out of wedlock, a reference to his adopted daughter from Bangladesh. Of course, this worked like a charm in South Carolina, Strom Thurmond having done this very thing notwithstanding. This was basically the end of the McCain campaign, with Bush winning big among evangelicals, those so-called values voters, who vote in favor of the most racist candidate possible and who love noted moral titan Donald Trump today, just as Baby Jesus would do. McCain won a few more states, but after Super Tuesday, was through.

There was some thought that McCain would have his revenge on Bush, especially after Jim Jeffords left the Republican Party and gave the Senate back to the Democrats. And there was enough of the asshole in McCain to believe this was possible. But in the end, even if Bush and friends had screwed him over personally, McCain is a genuine right-winger and liked basically all of Bush’s policies. So why would he have done this? Plus, he still wanted to be president really bad and that would kill his chances. So more or less, McCain just became a bog-standard Republican again, like he almost always was.

So McCain spent the Bush years cheerleading for the Iraq War, except for the torture—which wouldn’t stop him from voting for the war but made for good soundbites to sound mavericky. He said publicly that the U.S. would be greeted as liberators by the Iraqis, which, if it was ever true, didn’t last more than a New York minute. His main concern with the Iraq War in the early years was that we didn’t have enough troops there, publicly criticizing Donald Rumsfeld for believing we needed relatively few. And when the war did go disastrously for the United States, McCain was the main force in Congress behind the 2007 troop surge, which had some military effectiveness but also made McCain completely unable to separate himself from an unpopular and pointless war at the moment he was preparing another run for the White House.

And he spent those years voting for basically every Bush domestic policy proposed. In his free time, he was on TV over and over and over again, maintaining his role as Big Media Hero. For example, the Beltway media loved McCain’s role in the Gang of 14, the bipartisan senators who crafted a compromise allowing Republicans to fill the judiciary with terrible conservative judges. But that was McCain’s game: work with gullible Democrats (of which there were so many in the 2000s) to fashion a policy agreement that allowed someone like Janice Rogers Brown through without a filibuster, all with the end game of preserving the filibuster for Republicans when a Democrat became president—which of course they then used to unprecedented extremes. As for the Supreme Court, he said that John Roberts and Sam Alito were “two of the finest justices ever appointed to the United States Supreme Court.” On other policies, again, just a Republican seeking to move resources to the rich. Tax cuts for the rich? You bet!

On the other hand, McCain deserves some credit for not being an anti-immigration extremist. He pressed for comprehensive immigration reform, a project supported by people from George W. Bush to Ted Kennedy, with whom he cosponsored legislation. But there was no way that Republican legislators were going to seriously look to pass this bill and McCain wasn’t going to buck them enough to actually do something about it.

McCain’s 2008 presidential run was hardly predestined for success. An increasingly radicalized Republican base really hated that he wanted a reasonable solution for immigrants that did not deport them. He struggled with fundraising early on. But so did everyone else. The field was a mess, with Republicans deeply unpopular, a sadly impermanent state. Mike Huckabee was a clown who could win in Iowa, but what whackadoodle can’t win over Iowa Republicans? These are people who vote repeatedly for actual Nazi Steve King. But when he beat Mitt Romney in New Hampshire and Huckabee in South Carolina, it was pretty much over. Mr. Noun/Verb/9-11 completely failed and so did Mr. Reverse Mortgage Fred Thompson.

Who did Mr. Maverick announce as his vice-presidential candidate? Why, none other than Sarah Palin! How bipartisan, naming an ignoramus and quasi-fascist whose sole policy objective was making the libs cry! It’s worth noting how much McCain damaged the nation through this choice. Sarah Palin was an irresponsible, incompetent clown. But because she delivered red meat to the base and of course the racists who vote Republican loved it, her handlers and supporters realized that she was the ticket to the future. This wasn’t all on her; it’s not as if George Bush’s naming of Dan Quayle to the ticket in 1988 was all that different, even if he was a generation of wingnut before Palin. But in the aftermath, Palin-esque politicians—more competent and less self-involved—would deliver Republican victories by making white people feel good about expressing their resentments in the most crude way possible. This of course paved the way for Donald Trump, someone even less competent and more self-involved than Palin, but one even better at white supremacy. Thanks John.

In fact, it’s amazing that his entire 2008 campaign didn’t permanently kill his reputation. Even outside of Palin, McCain said nasty thing after nasty thing. Sure, he might have joked about pimping out his wife to bikers at the Sturgis rally, but that’s just boys being boys, amiright? There was his constant invocation of Joe the Plumber, an early rendition of the Trump campaign if there ever was one. McCain repeatedly tried to taint Obama by calling his policies “socialist.” By the end of the campaign, the entire world was relieved. Europeans especially had heard of the reputation of McCain, but ended up calling him John McSame, as they realized he was a very nasty old man who supported nearly all the failed policies of George W. Bush. Moreover, it’s not like McCain softened his positions. He still supported a constitutional amendment against abortion. He either wouldn’t or just couldn’t answer basic questions about sex education and whether contraception stops the spread of HIV. He campaigned on extending the Bush tax cuts through reducing Social Security but wouldn’t specify how that would work. Now that’s the kind of bipartisanship that excites the Beltway! He publicly said he would consult with Sam Brownback on judicial appointments, ensuring that whoever he named would be, well, someone like Neil Gorsuch.

Then there was the time he couldn’t remember how many houses he owned—or should I say his extremely wealthy wife owned. Turned out the answer was more than 10. Who can keep count! And really, that was a great answer in the middle of a housing crisis. I hadn’t seen a Republican presidential candidate so in touch with everyday people since the time George HW Bush was amazed at a grocery store checkout scanner.

Of course, nothing McCain said was enough for the rabid Republican base, which was dying for someone like Donald Trump. McCain rallies became openly racist and Islamophobic, with Republicans demanding he attack Obama in the most disgusting way possible. McCain did at least resist the worst of this.

Failing pretty miserably in the polls, McCain tried to get Barack Obama to suspend the campaign on September 24 so they could return to the Senate and work on the financial crisis. Not being a complete idiot, Obama refused. And as November rolled around, it was clear that McCain would get blown out of the water, which he did.

And here’s the rub—nothing about McCain’s lionization by the media before, during, or after his presidential campaign made any sense at all. With very few exceptions, he was just a standard right-wing Republican. He was nothing but a Goldwater follower who cared about foreign policy a lot. Basically, McCain’s relationship to the media comes out of their deep desire for a Republican Daddy who will protect their financial assets, make America look real tough on the international stage, provide lip service to international standards of behavior, and at least not sound like a maniac on social policy, despite his actual voting record. This is the ideal for our Beltway media and it’s disgusting.

McCain returned to the Senate in 2009 and played the exact same role as he had before. It’s amazing that he wasn’t subjected to the rule that losing presidential candidates must disappear after the election. Or wait, is that law only applied to female presidential candidates? Hmmm… Anyway, McCain remained the same bog-standard hack he always was, talking about pork in the federal budget by bringing up hi-larious issues such as the government funding beaver management programs—as if that wasn’t an actual issue land managers face. There was a brief moment when he and Obama had a good relationship, but that ended as soon as Mitch McConnell decided the Republican response to losing power would be to destroy as many norms of American politics as possible, in the most cynical manner he could. McCain joined this with aplomb, despite his mysteriously rehabbed reputation with the media as a bipartisan leader—a reputation that had been seriously damaged for only about a month at the end of the election. McCain ripped Obama for pulling out of plans to build an unnecessary missile defense complex in Poland. Despite his previous support for doing something about climate change, he now refused to engage in any constructive legislation to address it. Not with Obama in the White House he wouldn’t! He led the filibuster to stop the repeal of “Don’t Ask, Don’t Tell.” When it finally was repealed, he said that it was “a very sad day” that would undermine the military. And it’s true—how has the U.S. military functioned since, what with all the gay sex? McCain hated the Affordable Care Act when it was passed, regardless of his later vote to save it. He once sponsored the DREAM Act, but now voted against it. I could go on. If McCain had been a maverick before—which he hadn’t—he was a full-fledged hack now.

Yes, McCain had some issues where he had respectable standards of decency. He consistently opposed torture, but then did absolutely nothing to object to pro-torture politicians outside of this narrow zone. He might vote against a particular nominee who had been directly involved in torture, but then would go on talk show after talk show defending the people who put said person there and the policies that led to the torture and would lead to more. The McCain-Feingold campaign finance bill was a good one, but again, once it began to be chipped away, McCain did nothing but support the very people responsible. After Benghazi, McCain was on the front lines accusing Hillary Clinton of awful things that she was not responsible for, calling it worse than Watergate and ensuring that Susan Rice did not succeed Hillary as Secretary of State. Of course, McCain was all about intensifying the war in Syria with the massive army of our supposed allies. What could have gone wrong! He would occasionally return to some bipartisan actions, such as his support for comprehensive immigration reform, but in the end, he almost always put the Republican Party over the nation’s needs. When he could have really stood up against Donald Trump, a man who had directly insulted him, he did not. He voted for the judges, voted for Jefferson Beauregard Sessions III, voted for almost the entire Trump/Ryan/McConnell policy agenda.

Overall, the man had very few principles that trumped his extreme partisanship. Take, for example, his position on Supreme Court justices. For much of his career, he voted for whoever a president nominated, whether it was Robert Bork or Ruth Bader Ginsburg. If you believe a president has the right to name whoever they want to the Court, then live by it. OK. But at the end, when control over the Court was in the balance and Mitch McConnell was willing to destroy two centuries of norms in order to advance his radical right-wing agenda, McCain completely changed course! First, he voted against Sonia Sotomayor. Then there was no way he would vote on Merrick Garland. And after he helped McConnell prevent Obama from filling that seat, he stated that Republicans would block all Supreme Court nominations Hillary Clinton would have made, saying “I promise you that we will be united against any Supreme Court nominee that Hillary Clinton, if she were president, would put up.” Now that’s some independent bipartisan leadership! And look, I am more than happy to give McCain a bit of credit for voting against the repeal of the ACA in 2017. He certainly doesn’t deserve more credit than anyone else who voted against the bill, but fine. Good for you. For once you were no less terrible than the worst Democrat in the Senate.

On very rare occasions, particularly in the 2008-10 period, reporters would realize how awful McCain actually was, write a column claiming they were reconsidering the man, and then go back to lauding him soon after. Even before he died, McCain became the object through which reporters could pursue their wet dreams in obituaries. Dana Milbank’s may be the most sycophantic, but this exchange between Bret Stephens and Gail Collins really isn’t any better. CNN decided it was time to publish articles by Asian-Americans forgiving McCain for his grotesque racism; after all, what are the feelings of marginalization due to racial discrimination compared to the veneration of War Hero Maverick McCain?

Who will our lovely media turn their desperate attention to now? Is Michael Bloomberg the only man who can save us? Is Lindsey Graham the Republican Daddy we need? A generation of Meet the Press appearances demands to know!

John McCain is survived by, among others, his wife Cindy and his daughter Meghan, who has recently spent her time hyperventilating about inheritance taxes and marrying an actual fascist.

