A few years ago, the EU declared that in today’s digital search landscape, individuals have the right to be forgotten. Essentially, our full digital histories should expire and we should be able to remove ourselves from the record for reasons of privacy and crafting personal narratives.
You might be surprised to learn that, as a historian, I agree. People deserve to be forgotten. As the study of history has increasingly engaged with ethical questions about whose history we learn and why we have more access to some people than others, we have dug into the ways in which knowing more is harmful. Perhaps the best example of this is indigenous history in America. From Native American remains held in decontextualized anonymity to finds that potentially challenge indigenous histories, Native Americans gain very little from allowing their pasts to be studied by outsiders. The history that is most useful right now is indigenous relations with Europeans and their descendants, which help illustrate repeated abuses and have a direct impact on indigenous rights today. While this political use of history might be unsavory or uncomfortable for some people, who are perhaps used to thinking of history as objective truth, acknowledging that history is always studied within the current frame of reference highlights the politics behind what we have otherwise chosen to remember.
Recently, I’ve been applying this in my own life outside of my work as a historian. If I recognize that a group of people might have valid reasons not to allow the study of their own past, I begin to understand that there is not objective value in every thought or every question – not even mine. I often struggle with a flood of ideas, and the experience of this can be emotionally overwhelming and even physically painful. When I was immersed in art as a teenager, I would experience this as an intense anxiety to commit visions to paper, or whatever surface was readily available, including the leg of my jeans or my left forearm. Having this kind of creative drive is applauded – look how many thoughts you have, you’re so insightful and inspired! No one ever said to me “maybe you should take a deep breath and forget some of them”.
The idea of forgetting anything filled me with panic. I protected my hoard of drawings and poems like a dragon. I pay for Dropbox so that I cannot possibly lose anything I’ve written, going back 20 years. And while it’s interesting to go back and see my own personal history, and sometimes I learn something about myself, these artifacts are not ultimately essential to my own narrative and I would be fine (though very disappointed) if they were gone.
I’ve started to see my brainstorms differently. What if I’m not struck by inspiration, but by a wave of anxiety? What if my panic isn’t a response to the fear that I might forget, but the thing driving it? What would happen if I let that thought just slide by?
Our culture discourages this way of thinking. We love hypotheticals about near-misses and paths not taken. We love the threat of a great achiever never getting their moment. We impose on ourselves the anxiety that if we don’t record it, we will never get credit for it, be able to use it as proof – perhaps it never even happened.
So what?
What if it never happened? There are so few events or ideas that are truly pivotal and one-off. My mom used to say to my brother, irritatingly but correctly, when he forgot what he was going to say: “if it was important, it will come back to you”. I would add “and if you relax”. Thoughts are like waves, and just because they go out doesn’t mean the ocean is gone. Allowing the thoughts to go by isn’t just for disturbing or negative ones, it’s for provocative ones of all kinds. If I sit with them, allow them to leave and come back, I can work through whether they are really worth pursuing at all, develop them through different iterations, and explore several futures. Once I stop being so precious about every thought, I start to realize how often I have similar ideas, I can see the commonalities, and I can get to the core concept that I’m really interested in.
I tell my students constantly that they have to take breaks while they are writing, so that they can forget what they wrote and who they were when they wrote it, come back, and criticize the person who used to be. Forgetting is an important aspect of self-criticism, not because we’re never going to remember again, but because we need the opportunity to choose rediscovery.
When I moved from New York to Minnesota for college, my greatest culture shock was the bread.
I remember the first time I ordered a sandwich on rye, knowing full well that the cafeteria deli counter could not possibly have what I grew up calling “Jewish rye” – a crusty, slightly dense and squishy loaf, often covered in caraway seeds and thinly sliced for pastrami sandwiches or toasting. Still, my morbid fascination faded to simple disappointment when what I was handed was marbled sandwich loaf – the wrong taste, none of the texture.
I came of age during the era of the Atkins diet, when New York’s bread culture was demonized by the very population that was devoted to it. So perhaps I didn’t realize how attached I was to my daily bread. Everyone I knew had a paper bag of bagels in their freezer, and understood the protocol to slice them before freezing (ideally on their second day, when they were no longer fresh enough to be eaten untoasted but not yet rock hard to the point that slicing them was a safety hazard) and to revive them (microwave on a paper towel for a few seconds, then toast). Even as I watched the institutions of Ashkenazi bread go out of business (first my local bakery, where I had once seen the proprietor snatch a cake out of the hands of a customer she was fighting with and storm off into the back of the shop, then the famous H&H bagels), I still nested in the comfort of dense, Eastern European breads that accompanied most meals. Bagels were for breakfast, lunch, a snack, or a dinner of desperation. Rye, pumpernickel, baguettes, Kaiser rolls, and Italian loaves were the foundations of heavy lunch sandwiches, which had meat, spicy brown mustard, and lettuce and tomato, but never cheese. Tiny versions of these breads accompanied soup, especially at every diner in the city, which coincidentally had exactly the same schedule for soup of the day (Thursday was split pea, matzo ball was always available). Challah was of course for Friday and Saturday, but packaged challah, including mini versions, was a known snacking hazard, around which circled an etiquette of slicing versus pulling off a single knot of the braid. Leaving New York meant leaving not only the ready availability of these breads, but the standards that surrounded them.
I knew I would never find bagels like in New York (let’s be generous and say “the tri-state area”). I was aware enough of the rest of the country, and had already been told so many times by adults I knew, that bagels could only be found in the Greatest City in the World(TM). There were ways of working around this. I always brought bagels back with me when I visited, and asked others coming to see me to bring or send them as well. I learned about acceptable substitutions (Finagle-a-bagel in Boston was great, Einstein Bros is fine). But I also simply learned to live without. It was a big adjustment. I didn’t love bread, I just loved my bread. I didn’t know how to eat the way Minnesotans ate. Boston was close enough to be adaptable but far enough that I never quite settled. When I moved to California that was yet another adaptation. American food culture also changed drastically in those years, with the attack on fast foods, the rise of the farm to table movement, and the explosion of both chef culture and home cooking. The food template I had grown up with in the 90s (soup/salad/sandwich for lunch, protein with vegetables on the side for dinner), which was boring but satisfying, was being upended to eventually arrive in our current moment where it is easy to be vegan and nearly every meal is best served in a bowl.
And then Boichick appeared. The shakeup that was the New York Times declaring that the best bagels in the country could be found in Berkeley, CA cannot be overstated. H&H was gone. There were no good bagels to be had in New York anyway. It seemed like everyone had left New York. My family kept telling me that all good bagels were in New Jersey (traitors!) and something about Jersey tomatoes (whatever that is). And here, the Paper of Record (TM) was declaring that the long-running migration of New Yorkers to the Bay Area had finally yielded not only passable but exemplary bagels mere miles from where I had transplanted?!? Everything was on its head.
They were good. They weren’t life-changing. But the culture was all wrong. They were too expensive. I had to wait on line? Outside? For 20+ minutes? I had to plan the day I wanted to pick them up? This was all contrary to the culture. I remember a Time Warner Cable ad that ran when I was a kid, showing impatient people standing on a long line to use a payphone (a dated but accurate depiction of city life in the 90s), and it said “In New York, wait is a four letter word.” I was very confused, because “wait” literally is spelled with four letters, and at that young age I didn’t understand that “four letter word” meant a curse or a bad word. Eventually I got it, and this ad is burned into my brain. A true New Yorker would never wait. If we know there is going to be a wait, we plan on a waiting activity – “let’s go to that cool restaurant, we’ll put our name down and then walk two blocks to a bar, have a drink for 45 minutes, and by then our table will be ready”.
Now, five years into the bagel renaissance of the West Coast, I am in Seattle, where there are numerous quite good bagels to be had. They are not and cannot be New York bagels, though. For starters, the West Coast bagel is a different kind of bread. The “correct” bread method, typified by the sourdough craze of the early pandemic, is the slow, multi-stage rise from kept starter, to produce a pocketed, fluffy, slightly chewy product with a crisp crust. Ashkenazi bread is none of these things. It is dense and squishy, with a distinct crust that is more like a skin (my mom used to peel the skin off of a bagel and put it back in the bag, which we mocked her for relentlessly). West Coast bagels are delicious, but they are different. But beyond the product itself, the culture is necessarily different as well. The craze has died down, so wait times are shorter, but people are still willing to wait without multitasking, because this is the West Coast and not everyone moves through life like a shark, saying things like “you can sleep when you’re dead”.
Perhaps the most disruptive element of West Coast bagel culture, though, is when bagels are available. Bagels are a breakfast food here, served as sandwiches (typically toasted) with a variety of toppings, and only in the morning. I’m not saying no New York bodega has ever made a bacon egg and cheese with a bagel, I’m just saying that’s not the default (fun fact: my Colorado-born husband used to order a bagel with cream cheese and egg and it always sowed confusion). Few bagel places are open after 1, and the ones that are have been cleaned out of everything except the strangest flavors (no one wants a pumpernickel blueberry, Blazing Bagels). I have also recently discovered that bagels are largely understood as a weekend breakfast food, and so most of the popular stores are closed on Tuesdays. I don’t know what to do with this. I want a bagel for lunch in the middle of the week, when I have no capacity to make complex decisions or prepare lunch from home anymore.
Recently, a new place opened in University Village (an upscale outdoor mall) called Hey Bagel. Hey Bagel offers the Disneyland version of the New York Bagel Experience (TM). They don’t toast, they don’t make sandwiches. In the style of H&H, you ask what’s hot and you can get it with a package of cream cheese (although these are small and available in several flavors, not a brick of Philadelphia original). You’re meant to tear off bites of hot bagel and drag it through the cheese, like the bread and dip it is. Kenji Lopez-Alt shared an Instagram Reel of Hey Bagel and now they can’t keep up with demand.
Will West Coast bagel culture ever recapture late-20th century New York? Almost certainly not. It’s still interesting to see the push/pull of these two cultures trying to find an equilibrium where everyone can enjoy delicious baked goods. For myself, I’ve accepted the regionalism of food. I make the most of every trip home, even if it means I have to go to New Jersey (or settle for Tal). I make some things myself (mostly chopped liver, though occasionally bagels or rye). I don’t want to be able to find exactly the same culture in Seattle as in Manhattan; they’re different places, and there’s a reason I left. It was good to have the push to try different foods. It’s good for me to understand the regional character of where I live now, and to see clearly the regional character of where I’m from, not as the default, but as another way to be. It’s even highlighted for me the Seattle-New York connection, the smoked and cured salmon industry (and Atlantic salmon cannot compete with Sockeye). And, perhaps the truest element of New York culture, I need something to complain about.
In the Passover seder, there is possibly no more controversial section than the Four Children. As the seder sets up a series of conversations and pedagogical devices around the Exodus story, the Four Children is the most explicit set of instructions for parents to teach their children how to internalize the preferred message. The four children – the wise one, the wicked one, the simple one, and the one who does not know how to ask – each ask in their own way about the meaning of the story. The haggadah, the guidebook of the seder, then offers a set answer for each one. The controversy mostly surrounds the wicked child, whose phrasing “what does this story mean to you” seems largely innocuous and doesn’t seem to warrant the harsh response that, because the child has distanced themselves from the leader of the seder and the Biblical Israelites, they should in turn be distanced from the community. This call and response is particularly offensive to the analytical, inquisitive, and skeptical liberal Jew, and from that perspective I and almost everyone I’ve ever made a seder with has bemoaned this portion of the lesson. But this Passover I find myself drawn to the child who does not know how to ask.
The setup of the four children implies differences in both age and intelligence. And most haggadahs (fine, haggadot) will illustrate this section as children in descending age order, from early teen (the age of religious adulthood) to toddlerhood. That’s how I’ve always experienced it, especially since there were four children in my family when I was a kid, and, importantly, the wicked child mapped appropriately onto my impish second brother. The child who does not know how to ask is therefore often depicted as a pre-verbal or otherwise very young child who literally lacks the cognitive development to comprehend the basics of the story or to phrase a question. And so the response to simply repeat the most basic message of the story to this child (“this story has meaning because of what God did for us when They brought us out of Egypt”) seems both obvious and patronizing.
But as with every aspect of the seder, this setup is also a metaphor. And coming around to the end of my first year of teaching as a professor, I could easily call them the four students. Put that way, I’m thinking about this as a pedagogical framework, a way of fostering discourse and managing my classroom.
The reason that the child who does not know how to ask is drawing my attention is that I am faced, for the first time in over a decade of teaching, with students who exist passively in my classroom. Literally, children who do not know how to ask. Before, I encountered students who engaged with the material in its intended spirit, asking deeper questions about the specifics and the implications of the lesson (the wise child). Students who intentionally derailed the conversation with personal challenges, loaded questions with thinly veiled political agendas, or rude comments (the wicked child). Or, like the wicked child, students who had no genuine interest in what I laid before them, but only asked me for my interpretation so they could parrot it back, thinking that would get them the best grade. And of course students who requested extra repetition of the material in order to comprehend the basic facts and move towards a superficial interpretation (the simple child).
Although not all students actively speak in class, my experience was that they all communicated engagement that fell into one of these categories. Still stuck in the literal interpretation of the four children, a student who does not know how to ask would be a student who is so far below the level of the material that they cannot begin to experience their own reception of it critically – not only would they not know how to ask me, the instructor, a relevant question, they would not know how to ask themselves the basic self-assessment questions to gauge their own comprehension.
In the study of history, we don’t really believe that any material is beyond a student. We think there are methods of analysis that are complex and require some perspective to fully grasp, but ultimately all of these are accessible at any level, at least in part. In that vein, my public humanities project, The Medievalist Toolkit, has been experimenting with introducing students to medieval history through its uses and abuses – rather than starting with the time period itself and adding on critiques of its politicization in later classes, we tell students about how later peoples have imagined this time period from the start. This has been quite successful, because it’s not advanced material, it’s just more complex. We don’t always have to go from the simple to the complex if we trust our students to ask good questions.
So what is the student who does not know how to ask, and why am I encountering them for the first time this year?
My students who do not know how to ask are not incapable of comprehending the material, or even of checking their comprehension. They are disengaged. They are passive. They do the readings like they are lying at the edge of the ocean, letting it wash over them without trying to swim or even float. They wait for me to justify why the subject is interesting, entering the classroom without any drive of their own except the obligation of earning a degree. The student who does not know how to ask is passionless, uncritical, and impenetrable.
Any teacher working now can tell you exactly what is producing these students. It is the COVID-era cocktail of emotional burnout, stunted schooling, and a rapidly devalued education. It’s not that I haven’t had to justify the value of history to my students before – it’s history, half of the world’s most-hated subject. But students who went through all of high school during COVID are on an educational conveyor belt that is constantly at risk of breaking down. There’s little room in brains fogged by anxiety and taxed by traditional modalities for genuine interest in the subject matter.
This sounds too harsh, and I don’t mean that my students are dull. But I do see that as soon as they enter the classroom, the Zoom screen goes up over their eyes, and four years of passive education under strained circumstances take over.
And this is where I find myself surprisingly drawn to the haggadah’s answer for how to approach this student: just teach them anyway. Explain the material, tell them the meaning of the story as I understand it. Offer them an answer to a question they didn’t ask. It’s a harder job than the haggadah makes it out to be, since I have to bring all the energy to the room, and smile through weary sighs and blank faces, and offer extra engaging tidbits that would normally come from the Wise or the Wicked or even the Simple students in the room. It’s even more of a one-woman show than teaching is under “normal” circumstances.
I would be remiss if I didn’t also mention what the story of Passover means to me. Passover is not about the land or even the time of suffering. It is about building community through stories, conversation, and food. It is about taking on the pain of another person in order to fight against injustice. It is a course in experimental pedagogy. On Passover I focus on the suffering that is inflicted in my name and work to end it. Especially while I am teaching my course Landscapes of Medieval Mediterranean Religion, “next year in Jerusalem” is a state of mind, a vision to restore the historical pluralism of a place and to achieve a reality in which everyone has a deeply rooted home.
My name is Robin, and I have email anxiety.

Ok so some of this (dare I say) fear of my inbox comes from a reasonable place. While I was writing my dissertation and applying for jobs, emails were often life-changing, soul-crushing, and heart-rending. There were also times that I missed extremely important, time-sensitive information because it got buried in my inbox or I waited too long to work myself up to open it.
But I’m also finding that these days a big part of my email anxiety comes from the time dealing with emails demands of me. Emails are not just messages. They are reminders for action, they contain important instructions that need to be followed carefully, and they are delicate communications that set the tone of professional and personal relationships. There are also so many of them.
When I started working from home in 2019, I began to break up my day by types of tasks. I knew that I did my best writing between 10 and 2 and that I’d be most emotionally prepared to deal with other people before or after that. So I dedicated the first hour of my morning to emails. Back then, it was just thoughtful messages – me asking other people for advice or opinions, other people asking me to set up meetings, and the occasional communication about my writing. Now my emails are event invitations, course content from my past self, endless back-and-forth exchanges to set up meetings and trade information, and of course mountains of cc’s. Dealing with my email is no longer an hour a day, but days’ worth of triaging leading up to twice-weekly 2-hour sessions where I can sit at my computer (as opposed to my phone) and do all the important actions my emails demand of me, like submitting grades or using my benefits. And this is simplified from last year, when I had three separate work emails, between my grad school, my teaching job, and my fellowship.
I’m not kidding myself, either. I know that this is just the beginning. I try to imagine how many emails my graduate advisor or my department chair gets. I’m amazed they can keep their heads on straight. But as with all aspects of the profession, I know this is a skill that will improve in time as long as I don’t run from it. The first step has to be letting go of the fear of emails. I let go of the fear of academic criticism (or maybe I’m just numb to it) and it’s making me a better writer. Now I need to get out of my own way and just answer my goddamn emails.
During the pandemic lockdown, I was finally convinced to watch Outlander. When the show first aired, it gained immediate notoriety for being a steamy romance. What I found was a shockingly boring military historical drama about the Jacobite rebellion of 1745 that, yes, was interspersed with a significant amount of explicit sex, but seemingly half of that was assault. Later seasons change the setting to drop the characters into the highlights of 18th-century Atlantic history – pre-Revolution France, Caribbean piracy, pre-Revolutionary America (with a brief jaunt back into the 1960s with glasses on to show time has passed) – but the formula remains the same. Each season has a military/political focus with some good costumes and cultural fact sheet for color, punctuated by equal amounts of consensual sex and extremely disturbing sexual violence. I made it much farther than I really meant to given how much I disliked this setup, but when literally every main character had been assaulted I decided I was done.

Outlander may be pulpy fantasy (after all, the motivating plot device is time travel), but it encapsulates the issue with most historical fiction. It’s a three-part cocktail of conceits: 1) the main appeal is the fantasy of entering an “exciting time” in history, defined by major military or political change; 2) costuming and set are themselves central features of the storytelling because they are essential to setting the scene; and, most importantly, 3) despite the focus on this period, we must remember that it is better to be alive in the present than in the past. So what seems to many people a fun glimpse into the past, a way to “learn history”, is more like broccoli cheddar soup: you’re not so much eating your vegetables as throwing a few nuggets of nutrition into otherwise empty calories. This approach is, to me, the greatest disservice to the telling of history, because for all of the effort to bring history to life, it creates more distance between the story and the audience. From my perspective, history is made real through compassion for the complexity of life in the past. But we can’t have compassion if we enter into these stories clinging to the notion that life must be better in the present, that the redeeming features of these stories are the lasting impact of the events and the escapism of their settings. Devices like sexual assault in stories such as this serve to prop up these expectations: the second Outlander’s Claire enters the 18th century, she is subjected to a constant barrage of threats of sexual violence, underlined by her crusade to bring “modern medicine” to a time literally plagued by smallpox.

Historical fiction is a genre that I keep returning to in the hopes that I will like it more this time, though I rarely do. More so now in movies and tv than books, I find myself struck again and again by how much this genre depends on suffering porn, and I remind myself to avoid it. This, of course, is the basis for the “gritty realism” of Game of Thrones and its claims of historical fidelity to the Middle Ages. But this basic disrespect for the past, this inability to tell historical stories without focusing, to some extent, on suffering, runs through even relatively benign or seemingly joyful properties. I had this thought initially when I watched the remake of A League of Their Own, which reorients the story of the women’s All-American baseball league around the queer women who could have found refuge in it and the Black women who were excluded from it. Abbi Jacobson, who starred in and wrote the show, said that one of her goals was to show gay joy, to avoid the “bury your gays” trap. And yet the show itself has nary a moment of queer women living as themselves without the plot punishing them for it, through the consequences of being found out, through police raids, through the social complications of marriage. That plot progression of showing gay joy and following it immediately with severe consequences would have been right at home under the Hays Code, the repressive standards that censored American movies between 1934 and 1968, which allowed the representation of things considered “socially undesirable” so long as they were punished. It’s not ignoring the historical reality to show people existing as themselves in the societies they made without constantly following it up with violence, retribution, or tragedy.
In contrast, HBO’s The Gilded Age takes almost the opposite approach, with no violence, no sex, no suffering for the lower class, and little more than the slightest racially-motivated social snubs during the 1880s. For a time period defined by extreme class inequality, all the show depicts is the lives of New York aristocracy, with occasional glimpses into the seemingly stable and normal lives of their servants, and infrequent jaunts into the rosy utopia of segregated Black society. The most ill will the show has depicted is a tie between a nice gay man who plots to marry a naïve young woman for her money and a racist lady’s maid who tries to discredit the show’s only major Black character, perhaps because she is personally frustrated with caring for her elderly and belligerent mother. Neither plot has yet proved successful. Presumably, part of the justification for making this show was the recent public debate about whether we are in a new Gilded Age ourselves, and yet this show does nothing to depict why that might be a bad thing. It almost seems to apologize for a more civilized age of class disparity, when millionaires were guided by morality and the poor all had jobs.
Horns of our dilemma, 1890
Another recent, somewhat fanciful historical fiction that gets a bit closer to good history is Amazon Prime’s The English. The English goes in guns blazing with a social agenda, taking the Western genre and turning it on its head to focus on women and indigenous stories. It doesn’t shy away from violence, but it doesn’t glory in it either – sexual assault is seen in the aftermath or off screen, as is violence both individual and en masse toward Native Americans. The violence on screen is white men shooting each other, which is in an ironic way very true to the genre. This is a true centering of typically peripheral stories, allowing the characters to both show and tell the difficulties of their lives in this patriarchal settler-colonialist society. If anything, what is unsuccessful in this telling is how the plot falls back into the tropes of historical fiction by making these elements gain their power over the story through reveals, rather than using them to drive the narrative from the start. Rape is once again a plot device, rather than the inciting incident for the protagonist, which makes it seem more like a feature of the world in which she lives than the atrocity it was. Moreover, the grit of this show gives it that air of “at least things are better in our time”, when we know full well that circumstances have largely not improved for indigenous people and that women are often subjected to the same treatment as in the show. If anything, the punch of the show could have hit harder if it looked a little more modern.
https://www.memedroid.com/memes/detail/3580852
So have I ever been happy with historical fiction? Yes. I would say the best historical movie/tv show I’ve seen was HBO’s Rome, which is absolutely full of sex and violence (though surprisingly more restrained in sexualized violence than its spiritual successor Game of Thrones) but brings ancient Rome to life in a complex and current-feeling way. The first season of Rome is driven by soldiers trying to find their places in life after war, an ambitious mother trying to push her children into places of political prominence, a politician trying to hide a disability, and a family trying to hide the true parentage of an infant. At the same time, there is a woman kneeling naked under a slaughtered bull. It’s not perfect – there are inaccuracies, it’s still a very military/political narrative, and it absolutely does not pass the Bechdel test. But it makes the distant past feel relatable in a compelling way and it manages to find a new narrative in a story that Shakespeare already made famous. One of the things that I think is most successful about this is that as much as the fictional characters are wrapped up in the real events leading up to the assassination of Julius Caesar, the story isn’t trying to explain why that happened. In fact, that event is itself used more as a plot device that helps to establish the emotional weight of all the other things that come to a head at the same time.
For a completely different approach, the best historical novel I’ve read is Sarah Dunant’s The Birth of Venus, a personal narrative driven by social relationships and art, set in 15th-century Florence (although I don’t read a lot of novels anymore, so this book is already 20 years old). This is a story that could have fallen victim to a lot of overdone tropes of historical suffering, especially the Plague, which frankly would have been stupid (and I’ve read a book like that, Geraldine Brooks’s Year of Wonders; it was terrible). Instead, it deals with a young woman’s maturation into adulthood and how she could have navigated it within that society. What makes this such a beautiful book is that, unlike most historical fiction, it isn’t trying to tell the story of a pivotal event, but instead uses a historical setting to get at a human experience. As an art-loving teenager, I found this book incredibly true to life, putting equal emphasis on the struggle to paint beautifully and anxieties about finding love. It doesn’t have to be set in Florence; it could have been 1910s Paris or Egypt in 2000 BC. That’s the point. The difference in setting helps reveal something essentially human. That’s why I was drawn to fantasy literature as a teenager as well. Some settings make it easier – there were, after all, a lot of opportunities to meet painters as a wealthy teenager in 15th-century Florence. If we focus on historical settings that help us reflect on the present, we are also learning and appreciating more about the past. But if we are only using historical settings as window dressing, as superficial routes into the past, we aren’t really learning about either, because we can’t see past our expectations to get to any kind of meaningful depth.
If one of the marks of good storytelling is that the characters grow or change, then in historical fiction the audience must be a character: we must experience growth as the story unfolds, because our time period is always implicitly invoked by the historicity of the story’s setting.
https://www.invaluable.com/blog/art-history-memes/
If you’re not convinced, I have a final thought. If we return to Shakespeare, there’s something about his plays that we don’t often put in perspective. Those stories were not staged with a sense of “historical accuracy”. As Shakespeare scholars have pointed out, Caesar and other ancient characters in these plays were mostly dressed as Elizabethans. This is probably not for lack of knowledge – after all, medieval and Renaissance peoples were surrounded by depictions of ancient Romans, from the art on the walls of Catholic churches (though in England much less so after the Protestant Reformation) to extant statues and mosaics. Instead, it has to do with relatability. Shakespeare’s plays always had elements in the writing that were meant to draw in audiences from different backgrounds, such as the wisecracking peasants who are often prominent secondary characters. By costuming the performances in Shakespeare’s present, these productions removed the historical obstacle of the setting so that the audience could focus on the emotional thrust of the play. In our media, we are obsessed with historical accuracy, even for relatively recent settings. We are laboring under the tyranny of trying to get into the mindset of a different time, when it is impossible to escape the present. Slavish adherence to setting is often a distraction, for both the production team and the audience, from what the story is about and how it can spark meaning.
History of the World, Part 1 (1981), dir. Mel Brooks
At the start, I offered a set of guidelines that describe how historical fiction is often written. Here’s an alternative: treat historical fiction as fiction first and history second. Serve the characters and the plot before the setting. Allow the story to be driven by narrative beats rather than pivotal real events. And, perhaps most importantly to me as a historian, acknowledge that we as the producers of this art and its audience exist in the present and do not have a full appreciation of the past; we cannot fully remove ourselves from the telling of this story and we can appreciate it better when we acknowledge our relevance to it.
Or why I study the Middle Ages and dress like the 1940s.
If you run in vaguely historical or vintage fashion-oriented circles online, you’ve probably run into the Vintage Egyptologist (whose work I can’t condone) or maybe, more recently, the Overdressed Archaeologist (whose work I absolutely encourage you to check out). These two women combine interests that are seemingly a bit at odds: a love of vintage fashion and style with the informed pursuit of ancient history. Why these things seem at odds is pretty immediately revealing. We often contrast women who are interested in their appearance with women of substance (though they are not mutually exclusive). We also see history as divided into distinct time periods and geographies that shouldn’t be combined – if you are interested in the very distant past, you must not care much for the more recent past, what with all its technology and liberal thought (ha!). But this particular combination of interests also feels very right, and I can sum up why in one name: Indiana Jones. The most famous archaeologist, fictional or otherwise, unites an interest in the study of the past with the aesthetics of the 1940s, the fantasy of a myth with the truth of what we find in the ground, and the order of museums/fascism with the freedom of being an American who can run around looting artifacts for a “good reason” and punching Nazis in the face along the way. Indiana Jones is all about fantasizing in two settings at the same time.
“But in Latin, Jehova is spelled with an I.”
You see, we don’t just want to be the discoverers of the past. We want to be people with style, just as we remember the recent past to have been, struggling against the limitations of our society to make sense of a far more distant one. We want to think of ourselves being remembered even as we are doing the remembering. We want to signal our importance by fashioning our personal images more intentionally. And we want to simplify our study of history by framing it within a time that, depending on your view of the 1940s, was either simpler, and thus easier to do “objective” history in, or more clearly archetypal in its politics, and thus easier to make current social critiques in. The 1940s (and the century of archaeology and history leading up to them) are an intentional choice for this setting. It was the end of an era, the imperial age, before it became taboo for large, powerful nations to go into other countries and, while exploiting their natural resources and human labor, dig up their history and decide what it meant. At least, it became taboo to do that explicitly. It was during this time that the modern concept of history was defined. We looked at ourselves in the present and declared that we were outside of history, separate from the past, and that we could thus determine what the past was with pure objectivity. This is also the reason that time travel narratives don’t really exist in fiction before the late 19th century. This period saw the development of a complex and highly specific understanding of ourselves (by which I largely mean white Westerners) as our own creations, the end point of a series of choices about progress and civilization. As a result, we are both fascinated by and able to investigate all of the prior and “traditional” societies that did not seem to exhibit signs of such intentionality (although they are and have always been just as intentional).
It’s for this same reason that, a generation later, the kids raised on this kind of thinking and this approach to the past felt there was nowhere left to go but up. “Space, the final frontier” encapsulates in four words an entire era’s dogma.

As an American medievalist, I find the period I’m describing, from around the mid-19th century to the mid-20th, particularly important, because it’s when my field was created. America doesn’t have a medieval past – at least, not one that can serve as a foil and point of origin for a European society. Rather than find meaning in the Native American past that this country was rapidly erasing, Americans of European descent, particularly those with disposable wealth or some kind of European pedigree, wanted to stake their claim in European medieval history. They did this in very real, material ways, such as by collecting medieval antiquities, including entire buildings. This is a good moment for us to stop and appreciate what this kind of collecting really feels like, since it is the same process by which artifacts from all around the world have ended up in European and American art and cultural heritage institutions, a practice that is now very controversial. In the wake of WWI, American art dealers traveled to Europe and picked through the remains of French towns destroyed by the war, then brought back everything from illuminated bibles to jeweled crosses to, again, entire buildings, and sold them to wealthy American industrialists like the Rockefellers or financiers like J.P. Morgan, who eventually donated them to museums as a kind of very conspicuous philanthropy. For white Americans, this kind of story might help illustrate why people are so insistent that museums return artifacts to the countries they were found in – because the taking of those artifacts represents political conflicts and violence that are still current. This is why romanticizing the era in which most of this collecting happened is at best oblivious – the lifestyles of the people who did this kind of antiquities collecting were very much a part of the disregard they exhibited toward the modern civilizations that lived on top of those antiquities.
Appreciating archaeology as destruction (a favorite phrase of my college archaeology professor) also helps demonstrate how the perspective of one generation of men could influence a hundred years of the study of history – these objects and manuscripts became the basis for an American understanding of the medieval European past, complete with the idea that Americans have a connection to that past through our white ancestry. Said plainly: the study of the European Middle Ages in the US is based in white supremacy.
In case you couldn’t already tell, it’s a complicated time to be a medievalist.
When I think about my field, I now can’t help but think about the people who made it, the people whose interests guided how I have come to interact with this material. It’s not just something I’ve inherited. Charles Homer Haskins, the founder of American medieval studies, is the person whose claim I am directly arguing against in my dissertation. He said that we should study the brief period during which Latin Europeans controlled southern Italy in the 12th century because they brought back into Latin a knowledge of the Classical sciences that had been ceded to Arabic and Greek for hundreds of years. He justified an interest in this particular time and place based on the direct value it provided to the place we would come to understand as Europe. He drew a circle around all the people who could read Latin and left everyone else out. He articulated a sense of ownership over knowledge – and not just any knowledge, but knowledge of the sciences, the thing that European empires would argue made them great. He told me, an Ashkenazi Jewish woman living a century after him, to find value in a peripheral corner of Europe because it would be the spark that would make Europe dominate the rest of the world in his lifetime. My struggle as a historian has been dismantling every assumption he made, trying to show continuity where he showed rupture, emphasizing diversity where he saw hegemony.
I don’t know that my habit of styling myself based on a mid-century aesthetic originally came out of my interest in medieval history, but both my interests and my aesthetic might have come from the same place. We often talk about our upbringing in terms of what our parents gave us, but I think my grandparents had a lot more influence over how I understood my history. They told me what history was, whether by taking me to museums or bringing me souvenirs from their trips or even by being part of history. My grandparents very much understood themselves or wanted to understand themselves as European, but the reality of being Jewish and American made that a little difficult (even more so in my grandfather’s case, as a Polish immigrant in occupied Palestine). Their generation, the same one as Haskins and Rockefeller Jr., produced all of the experiences through which I came to know about history. Perhaps unintentionally, they made it so that I couldn’t understand what came before them without having to see it through their eyes.
A sarcophagus-shaped pencil case, now sold at the British Museum. My grandmother gave me a very similar one when I started school. https://www.britishmuseumshoponline.org/mummy-pencil-tin-nespernub.html
I didn’t start wearing mid-century clothes or wearing my hair in a vaguely Edwardian style until late in my grandmother’s life. As a kid, though, I was always wearing the biggest, fullest skirts possible. In my mind, they were better, more real maybe, because they were more “traditional”. I definitely had the sense that they were medieval, although I think back then the image in my mind was more 17th century. Probably the first spark that launched me into the mindset of the mid-century was the movie The Hours, which I know is where I got my hairstyle. My grandmother’s apartment was something of a window into the past for me. I would go there and dig through the closets, finding treasures from what seemed like a very long time ago, like an old pocket camera or boxes of war bonds. I once found a copy of Treasure Island that I brought home and placed on my shelf because I liked the look of the green cover. I was inventing cottagecore and dark academia out of the things I found in my grandmother’s closets. After my grandmother died when I was a teenager, I became more interested in understanding the time she grew up in, not for its own events, but for the perspective that had left a mark on the history I was consuming and the reality I was living. That became especially true as I became more aware of world events and struggled to understand her generation’s role in conflicts like Israel-Palestine.
To some extent, I think that the mid-century aesthetic is like an iconographic costume for historians. You put on your mid-century clothing to study history the same way you put on your deerstalker and pipe to solve a mystery.
Seriously, this is quite a visual trope.
But more than that, I think the mid-century is the filter through which we in the 21st century see history. We might acknowledge it more now, or maybe we have made that filter stronger over time. It’s certainly a product of how the history we consume has been written, but it might also reflect the way we choose to understand how history has come to us. Whether intentionally or not, premodern history arrives in the 21st century translated into mid-century, and a lot of us historians (and archaeologists, etc.) need to get into translator mode to understand it. I can’t condone the impulse to try to live in the past – that kind of mental transposing is pretty ignorant, and if you’re going to stoke an interest in any time period, you have to be aware of what that time period means now. But appreciating the perspective of a past generation is an essential aspect of the study of history, so you might as well enjoy the fashion while you’re doing it.