In 2018, my wife and I concluded we wouldn’t have children. We tried. After two miscarriages, some hardcore consultations, emotional whiplash, and complex decision-making, we committed to childlessness. In fact, hallelujah to the procreation gods who decided to spare us. We’ll get a pet instead. That was our retaliatory thinking.
Back then, we were living in two places, not apart but together. My wife was a freelance editor and could carry her work wherever. I had taken a faculty job in southern Ohio—a hellish landscape dotted with strip malls, outlets, and dessert factories. We stayed down there for months at a time, then returned to our homeland in northern Michigan over holidays and summers. In Michigan, we had friends, space, weather befitting our impulses, good food, and peace. In Ohio, we had work. My job was temporary by design, a visiting professorship, so we kept our small ranch home in Michigan and managed to buy a tiny condo in Ohio, which we tolerated from August to December, then February to May.
I’m aware of the class implications. We had two homes while others had none. We had two kitchens, four bathrooms, five bedrooms, two unfinished basements, and several closets spread over two states while a half million Americans had no place at all—and plenty more lived (still live) in hostile, cramped, or squalid conditions. But each step leading to our dual landownership was fraught with incredulity and defeat. We didn’t want to buy a condo. The idea had no allure, nor did renting a gooey apartment across from the Panty Shanty or other booze-soaked animal houses in that college town. We wanted a quiet place to work—no frills, no impressive views, no glitz. But the market was over-picked. We saw various apartments and rental houses. At the end of a long day, we pulled into Century 21 and asked if any small structure in the zip code could be bought. We lucked out because a two-bedroom condo had just come on the market. Even better, it sat one block outside the party zone, which locals called “the mile square.” An agent drove us there. Past the threshold, we all stopped and scanned. It had maple floors, only partly gnarled, granite counters, and new appliances. It didn’t stink of mold or shame. He turned to us. “This,” he said in pure revelation, “is the nicest condo in town.” We said yes.
The housing search was our introduction to southern Ohio. Granted, I’d grown up in Ohio—a small town wedged into the northwestern corner—so I had a sense of Buckeye tendencies. But southern Ohio is a different planet, and that university town a weird moon fuming with tired strife. The area felt exhausted, the entire biosphere panting in some way, winded by wealth and semi-recklessness. All the gleeful screaming, so many Maseratis, Mercedeses, and Ferraris parked along the streets, so many red plastic cups piled in front lawns weighed on the posture of small-town life. In socio-political terms, it was a demographic collision: disparate classes, ethnicities, and generations—people who wanted something utterly different from each moment—crawling around together. It wasn’t inclusive or diverse. It was entanglement. Invective hung in the air. Imagine all political conventions spilling into one stadium, everyone stomping on one another’s fallen banners and confetti, nobody happy to encounter different persuasions or anyone at all.
On our first trip to the regional foodstuff purchasing complex, we heard someone bellow, “I hate Koreans!” We turned to see a middle-aged white woman in a flowered dress next to a carful of Asian students who were likely Chinese, not Korean, given the university’s outreach programs. The woman’s declaration was both hateful and mournful. It was loud enough to carry across the lot, but she seemed soul-injured, hurt by her own rage. If those young Chinese students had run over and given her a hug, she’d likely have wept in their arms. That’s how it sounded. She wanted attention, hope, tenderness, something. And what about us? Did we hate it down there? Sure. Almost instantly and then more as the weeks rolled, we hated it with fervor, with clenched fists, with all the gusto of a punk rock anthem, but we also felt terrible for the town, for permanent residents, for minutes and seconds spent there, for time itself that must rub against such a place.
Were there kind and engaging humans inhabiting the area? Of course. We even met a few. And did I have some especially lucid colleagues? Certainly. But the cultural climate seethed beyond any furtive huddles. And despite scrolls of beneficent rhetoric, the university itself dominated the town and controlled all prevailing winds. Among various dismissive acts, the institution made an art of demoralizing new hires. Each spring, HR distributed letters to visiting faculty saying, in less polite terms, thanks for teaching here, but you’re officially done, so please leave. It was the notorious get out letter. When I first received it, I assumed I’d done something epically wrong. I hadn’t. The letter voiced the institution’s unstated but obvious mission to promote occupational discomfort, even mass anxiety. Once I was permanent faculty, the spring letter went something like thanks for your service, but please remind us in multiple ways (with charts, bullet points, and easily consumable globs of information) why you should continue siphoning money from our school.
Given everything, why stay in such a place? Why beat your head against such institutional walls? Why not pack up and leave? First, we stayed because we needed an insurance-granting job between us. Like millions of middle-class Americans, we were clinging to medical benefits folded into rotten employment. We stayed, also, because we hadn’t yet embraced alternatives, and during those eight years, I worked harder than my body wanted. For the first time in my life—and not since—I had high blood pressure. I developed chronic canker sores. I lost weight. My migraine prescription was always nearby. Still, we stayed. As it turned out, the work was sometimes better than okay. While underpaid (starting at $40K), I met enough savvy and big-hearted college students to make me believe that our civilization might not be doomed. This runs contrary to age-appropriate musing, but I’ll declare: at least one forthcoming generation includes a significant swath of focused, curious, and open-minded creatures. Many of my students embraced sophisticated concepts, tried out new terms, probed the layers of their own reasoning, and taught me things I did not know. They laughed at themselves, tried hard to shine, and even confessed their own mistakes. They were not yet victims of cable news or the petty self-righteousness of academia. Many were razor-sharp, widely read, highly skilled, and not-at-all haughty about it. In short, I appreciated my earnest students and told them, more than once—with the door closed so no faculty or administrators could hear—to hurry and take the societal steering wheel before it’s too late.
There is more to say about the town, region, campus, so many famously dismissive staff members—people who used facial expression as a weapon to ward off encounters. Sometimes, I would fight back and say hello to those whom I knew hated a greeting. It was cruel of me, but I was beginning to adopt the prevailing ethos. Meanwhile, my wife and I spent evenings in our condo strategizing. Could we stay another semester, another year, another minute?
One spring morning, I looked out our window at a patch of quasi-wilderness behind the condo and saw a feral cat, one of many, batting something into the air. I went out, chased it off, and found three massacred baby rabbits surrounded by a smear of others. Of course, cats will destroy wantonly, but this was a monument of doom, a lunatic’s ghastly dream. I did what I could. In digging a hole by a small bed of ivy, I noticed an adult rabbit watching. A parent? A relative or bystander? My wife noted the date. It was Mother’s Day.
It’s hard not to associate—not to make a basic epistemic move, the one Aristotle pinned as fundamental to human intelligence—and connect the murder scene with broader experience. We woke to dead rabbits shortly after two miscarriages, not long after I’d lost my father to kidney cancer, not long after my wife’s stepmother died from liver cancer, and in a professional climate of low-grade hostility. Maybe I’ve mashed all grievous events into one dense moment. Maybe I’ve taken synecdoche too far. Maybe the process of happily absorbing professional toxins while containing personal tragedy creates its own logic, but when I think of working in southern Ohio—of campus, of department politics, of the university’s well-appointed cruelty—I think of that cat.
*
While there are countless definitions and compelling metaphors for ideology, one element seems consistent: a mass of notions operating together without our conscious permission. Like the individual players on a winning team, these notions become victorious once they form a singular entity. But unlike the winning team, ideology tends toward quiet. Except for political ideology, which loves attention, most ideologies prefer camouflage: to hang out unrecognized, to crash the wedding and blend in like a second cousin. For this reason, they persist. With a few exceptions (e.g. fascism), they don’t go down in flames or work hard to make dramatic comebacks (also fascism).
There are no surefire ways to sense an ideology, to know whether one is hanging around in one’s consciousness. One simplistic test involves spelling. Does the thinking fit into an -ism? Do the operating notions belong to professionalism, environmentalism, feminism, pluralism, capitalism, Marxism, consumerism, Christian fundamentalism? If so, there’s an ideology at work. Of course, this test doesn’t encompass all. Some ideologies have no predictable suffix. They must be perceived and named. Enter famed sociologist Max Weber.
At the start of the 20th century, Weber identified the Protestant work ethic. It’s not simply Protestantism, the broad politico-theological vision born of Martin Luther. It’s a huge constellation of notions related to honor, fate, obligation, submission, self-sacrifice, and monetary reward. It’s an ideology that, according to Weber and most sociologists in his wake, helped capitalism get anchored in America. In other words, the Protestant work ethic is a multi-headed ideological creature that has shaped millions of lives. It includes beliefs in a shared currency, a system for distributing that currency according to labor, and a delicate connection between a deity and the rightness of labor. These beliefs converge in history to form a river that washes over us each morning and prompts us to work. To put it more directly, the Protestant work ethic made us all get alarm clocks. Consider a few of its most functional notions:
Adults work at least five days per week.
Work happens from morning to evening.
Nighttime is restoration for the next day’s work.
Getting up early for work is responsible. Staying up late is irresponsible.
Working all day means full participation in society.
Good people fully participate in society.
Society rewards good people.
Rewards are worth the hardships of work.
Rewards matter more than hardships.
Good workers endure all hardships.
I grew up in a working-class home. My father devoted his days to a local factory and my mother to a chain retail store. In junior high, I had a paper route. In high school, I worked at a pizza restaurant. In the summer between sophomore and junior years, I worked on a ranch in Texas. In college, I worked 40-50 hours per week in a rock band. In the summer, with the rock world requiring my presence at night, I worked 7:00 – 4:00 for the city municipal department and slept during lunch. In graduate school, I made sandwiches at a local bistro. After graduate school, I accepted my first full-time professional position at a community college in Toledo, Ohio. The position was a notorious 5/5, which means five courses each fall and spring semester, the kind of position widely characterized by college faculty as “in the trenches.” With 100 writing students each semester, instructors spend all evening, every evening, responding to student work. Weekends? Same. My second full-time position, at another community college, was a blissful 4/4. Still, I worked well over 50 hours per week on academic duties and then went into the wee hours for my own writerly efforts.
This is all to say I was born into work. I like it. I’ve never been good at working smart—a bourgeois euphemism for dodging work. In every job, there’s a level of intensity required to take on the task, and there are several levels of non-intensity that pass as work, especially to people surveilling from a safe distance. Throughout my teaching career, I had administrators tell me that I’d get “quicker” at responding to student writing, that my increasing prowess would manifest as efficiency. Over time, I did get better at responding, much better, which means I took longer. It’s easy, and a little gratifying, to characterize those administrators as naïve apologists for their own mismanagement, but I can also attribute their counsel to obedience. They were simply adhering to the discourses of corporatism.
After a few years in southern Ohio, my duties increased with more titles and more students. Thanks to COVID and the ongoing drive to save the institution’s money by increasing administrator pay and faculty workload (always both), I was assigned more courses and more responsibilities than I could count. I didn’t read anything but emails and student work for more than a year. In the winter of 2020, I suddenly couldn’t feel my fingers. I called my doctor and tried to explain: I really don’t think it’s my heart. But a medical office can’t take a call from a 50-year-old with numb fingers and say anything beyond get to the ER, immediately. It wasn’t a heart attack. While grateful, I suspected something else, but not necessarily this: nerve damage from too much gargoyle-like posturing over a computer, years of it. The diagnosis prompted an admission: I was totally burned out. A few months later, when my wife accepted a fully remote position, one that paid more than twice my salary, I made an extravagant, completely un-working-class move: I quit. I didn’t tell anyone to take that job and shove it (because I admired my department chair at the time), but I certainly felt the need to abandon my post. I wasn’t alone.
In 2021, over 40 million Americans quit. They’d had enough. In December alone, 4.3 million Americans left jobs across all sectors. Plenty did so because the job itself lost its shine, because it never had shine to begin with, because the effort didn’t match the pay, because customers had slowly sucked away all hope and goodness, or because the boss was toxic. Also, COVID had come along to remind us that working among the public is a perilous enterprise. In that rotten era, American consumers did a fine job of abusing clerks, retailers, and front-of-house staff. Given mandates and ongoing efforts to manage a novel virus, employees suffered threats, beatings, shootings, and murder. In short, work life in America had been officially broken. Ironically, the Great Resignation happened in a country committed to over-work. Historians, I suspect, will see those years, these years, as an era of rupture, of epic overturning. And while COVID played a role, other forces have held sway. Beneath the agonistic rhetoric of plagues, the finger pointing and shared incredulity, the masking and anti-masking, something else was afoot.
Since World War II, professionalization has been the single most audible drumbeat in the workplace. Nearly every identifiable field outside of manufacturing has developed a national or regional organization devoted to increased autonomy, funding, and legal standing. In other words, Americans have widely accepted the concept of a professional—an individual with autonomy and decision-making power. This is fundamentally different from a worker, someone who gets hired to fulfill a specific task as determined by a boss. Workers sell their bodies and time to complete a task; professionals apply the principles of a field to support an enterprise. It’s not a new concept by any stretch. The Hippocratic Oath is often referred to as an early professionalizing document. Doctors took it as a way of internalizing disciplinary principles. They were beholden, first and foremost, to shared ideas rather than any specific lord, superior, or employer. The professional oath traveled through millennia. In the 19th century, the American Medical Association developed a code of ethics, one that transcends any given hospital, office, or state. And in the 20th century, osteopathic doctors began taking a specific oath regardless of their place of employment. In the legal world, lawyers take a formal oath that binds their work to the state. They get licensure from governing bodies beyond any specific firm, and if they misbehave, reprimands come from a state or federal office.
Granted, in most fields, like academia, specific institutions—not states or interstate agencies—determine a worker’s value. But the atmosphere has changed even for geographically isolated universities. In the cultural trend toward professionalization, institutions must try, or look like they’re trying, to support and retain employees. Thoughtless or openly dismissive administrative offices are becoming anachronistic. It’s not okay to treat professionals like unwashed and unwanted stepchildren. If you hire someone, you owe them a dignified experience, which means abandoning people, processes, and discourses that undermine dignity. Ignoring such basics means ignoring an historical turn. It means clinging to a dying—or at least decrepit—ideology. It also means constant employee turnover and its significant monetary costs.
The Protestant work ethic had its day in the cultural sun. Granted, it’s still here, its weight pressing down like a carcass. But it’s no longer camouflaged. We all see it now. We can point to it, wave at it, flip it the bird. We’ll continue to work, find new jobs, or even stick with jobs we hate. But now the old PWE must share cultural space with other ways of thinking.
We don’t yet have a name for our new work-related contraption. It’s not simply gig work or transient professionalism or anything so narrow. Whatever is shouldering out the PWE will require some broader philosophical element. It will need to lace into our conception of time, life cycles, and purpose. It’s one hell of a mission. An ideology with such dharma cannot just materialize, but I believe it’s been slowly developing, methodically coming to fruition, seeping across economic sectors. And whatever it becomes, it will need to remain stealthy. To have lasting power, and become camouflaged by familiarity, the new contraption must not perform itself—as did the hippy/drop-out movement of the ’60s. As we all know, Flower Power had spiritualism, a socio-economic notion (communal life, shared effort), and a vibrant anti-authoritarianism that attracted the country’s youth. From the mid-’60s to the mid-’70s, hippies were a major cultural force. But the hippy movement also had a fashion sense, which is an ideological comorbidity. In other words, if you want to maintain an ideology, you can’t dress it up. Flower Power died as dramatically as it lived.
What does this mean for each of us? What happens now that the old PWE has an emergent competitor? It means we’ll have options lurking in the intellectual ozone. When the job sucks, when we’re expected to endure institution-made hardships (in the form of indignities, sneers, and harassment), when pay does not correspond to our labor, when we wake up and realize we’re feeding a parasitic aristocracy and getting fed little in return, more of us will leave it all behind and try something else. Our awareness will lead us down new paths, and those paths will not be lonely.
*
It’s been three years since our return to Michigan. In our new life and old latitude, we adopted two dogs, Brittany brothers whose ferocious joy and bombast shape each day. My wife is thriving. After the miscarriages and everything else in/about southern Ohio, she is full of professional and personal muster. We were fortunate. The miscarriages, along with one D&C, occurred before such incidents were framed by sharp political edges. We were lucky to have choices, freedom to exit, and smooth roads on which to blast the heck out.
My own professional trajectory continues. It’s true that nobody can take my degrees away. I still have a PhD, but beyond university walls, it means little or nothing. More aptly, it does little or nothing. It has no function beyond the Victorian taxonomies of academia. The moment I sent my resignation email, I felt instantly cut loose, adrift in a big socio-economic sea. It’s taken some pivoting and reorienting. Today, I earn a little less than my former colleagues. Fine with me. I am no longer wondering if I’ll take a bigger course load next year. Also fine with me. I spend no time interpreting college edicts on labor, dodging grossly exploitive politics, or recovering from college-wide meetings where faculty members are told that everyone is replaceable. All fine with me. I’ll admit the PWE, or some shard of it, still rides on my shoulder. It’s a heavy presence. While busy writing, editing, and managing Brittany initiatives, I sometimes feel as though I should race to campus, run to class, or log into a meeting. And in grocery store aisles, I wince at prices. Should I keep buying organic produce? Get a jug of cheap wine instead of a decent Côtes du Rhône? I realize the origin of the question: I’m no longer doing daily work that drains me completely, that makes my hands go numb, so I shouldn’t feel comfortable spending money. And I don’t. But something fundamental has changed—not just my schedule, blood pressure, and migraine frequency, but my sense of rightness and worth. I no longer feel like a dying rabbit.
John Mauk taught college writing courses for twenty-four years. During his career, he developed several widely used textbooks, worked closely with teachers around the country, and was twice elected professor of the year. Along the way, he studied fiction. His stories have appeared in journals such as Salamander, Arts and Letters, The Forge, New Millennium Writings, Main Street Rag, and The Dunes Review; his nonfiction in Rumpus, Beatrice.com, Writer’s Digest, and various anthologies. He has two full-length story collections, Field Notes for the Earthbound and Where All Things Flatten. He has judged for national writing contests, consulted for publishers, and read for magazines. He currently hosts Prose from the Underground, a free video series for active writers.
