This Was Never the Point

For about 10 years, everything I wrote was almost immediately published. This sort of instant gratification really does a number on a young writer’s brain.

During the onset of the digital age, newspaper editors became hungry for what would soon be called “content” rather than stories — online and in print — and this naturally led newsrooms the nation over to reward quantity over quality. Young, desperate-to-please writers like me who would churn out relatively decent work for pennies somehow became eminently publishable. Sure, I wrote some James Beard Award-nominated work amid the churn, but I mostly reviewed strip club food and composed top 10 lists of the best kitchen utensils to use as sex toys. (“Eggbeaters,” my editor had helpfully suggested when I balked.)

Still, I wanted to believe that the bulk of my work was good, actually. And having absolutely every impulse piece you write immediately published in the local paper (see: a series of haikus about Muscle Milk) does tend to make you think the general public gives more weight to your words than is true, actually.

I was among the most prolific writers at each place I worked. And in the heyday of social media and online platforms, this meant my words scattered like dandelion fluff. I was always fascinated to see how far my stories reached. One time, Alton Brown emailed me to say he’d liked a piece I’d written. Another time, a man emailed me my address and threatened to come to my house and rape me because he disagreed with my most recent restaurant review.

When I left the Houston Press in 2013, the art director, Monica, mocked up a cover of the paper for my going-away party. Among the coverlines around my face on the fake paper was the astounding number of articles I’d written during my time there — somewhere north of 11,000. I don’t remember exactly anymore, and although Monica carefully mounted the mock cover on a lovely mat for me to ostensibly frame one day, I threw it in the garbage when I moved out of my studio apartment later that year. The fake cover and the real number, both gone forever.

I haven’t kept much physical evidence over the years of my career as a food critic at the local paper or my work as an editor for the city magazine, which was once monthly and is now quarterly. The paper itself ceased physical publication six years ago and now exists only online. In true Houston fashion, the historic Press building itself was demolished in 2018, leaving no tangible trace of itself here on earth. I wonder if someone will one day search through old Houston Press editions and wonder why the paper suddenly evaporates in 2017. (For now, at least, a Google search of the terms “Backpage” and “lawsuit” suffices to explain.)

The paper recently changed the way its website is hosted and all of its archives prior to 2020 disappeared, a further distressing withdrawal from the world. If I hadn’t digitally archived all of my own years’ worth of Houston Press articles a couple of years ago, all trace of the online content I wrote would also have evaporated.

Most of those articles that I thought mattered so much, those pieces I labored over, the ones that ricocheted across the Internet, the ones that won awards, the ones that lost me friends, even the stupid and silly ones like the time I was assigned to write about the worst things to puke up on New Year’s Day — no one will ever read them again. They exist nowhere now except in my own little digital vault. The physical papers possibly exist in some libraries somewhere, but they contained perhaps only 15 percent of those 11,000+ articles I wrote. All those words, all those worried-over words, all vapor now.

For a long time, I struggled under the weight of writing all of these stories. When I was first hired as a young, naive food critic, I owed the paper three online articles a day (some of which would later be reprinted in the following week’s issue), a weekly restaurant review for the print edition and at least one cover feature every quarter. Once, it was about butchers reviving a lost charcuterie tradition; another time it was a pandering photo essay about chefs’ food-themed tattoos.

By virtue of this workload, everything I wrote was publishable — or at least we all pretended it was. And all of these articles were chum for the readers.

A dining review is chum in the water by its very nature: Plenty of people wanted to know whether the hot new restaurant in town was good, actually, and many, many more people wanted to read about it when that hot new restaurant was terrible, actually. The online articles were even less subtle.

A best-of burgers list? Done to death. How about a best-public-bathrooms-to-bang-in piece?

Sometimes these were my ideas, sometimes they weren’t. But at the end of the day, it was my byline attached to the story. This made it easier to part with the physical copies of papers and, later, magazines full of content I was embarrassed to witness as my own.

And yet I felt strangely compelled to keep the digital stuff. Its existence is no more or less precarious than the print stuff — there were only so many copies of Houstonia Magazine printed, for instance, and those increasingly rare early editions are certainly dwindling in number now that I’ve put so many of my own through the shredder — so it’s not about choosing to save one over another.

I suppose it’s more the idea that the online stories always existed in such a liminal state to begin with: layers of code and raw binary data that briefly coalesced on your computer screen to form an article about tracking down the elusive Dr Pepper Icee, before snapping back into the Internet ether once you close the tab. Sparking to life briefly, a little flame burning brightly for a moment, then just as quickly extinguished.

For a long time I wanted to distance myself from the things I wrote, for better or worse, because of the way in which I’d let my voluminous body of work come to define me. I left the city magazine as managing editor and stopped writing for public consumption altogether, turning inward to a university where I instead wrote about everyone’s opinions except my own, keeping my thoughts entirely to my private journals. It was a liberating relief.

No longer did I have to share my weekly thoughts on dining out with rabid Twitter and Facebook audiences that our publications’ owners insisted we were responsible for growing into an even more vociferous crowd, nor write the kind of personal essays in the city magazine that caused my cousin to stop speaking to me. I hadn’t anticipated the toll it would take on me, publishing my every thought because I was so eager to write and because some faceless publisher needed grist for the mill.

It turns out that keeping my opinions to myself and listening to other people’s opinions instead has been deeply therapeutic. This revelation will not rock the world of those stable, empathetic people out there for whom this is just a straightforward recipe for a gentle life. And some people — myself once included — don’t want a gentle life; they want a life that speaks truth to power or at least draws enough attention that they feel seen and heard for one brief moment.

Maybe that’s why I keep the old online stuff in my little vault, to remind me of my non-gentle days, when I was sparking to life in all the right and wrong ways. When I was figuring it all out, trying to fan my flame into a fire, watching it flare up and out of control on the bad days, admiring its bright, steady shine on the good ones. All that struggle, all those tears shed over hitting deadlines or surviving pitch meetings, all those words written and now gone — was all of it for nothing, if those articles are gone forever?

This was never the point.

Still Missing the Point

I was frustrated to see yesterday afternoon that some people – some very important people – continue to miss the point of the Foodie Backlash article I wrote last year for work. I’m more frustrated, frankly, that they continue to bring it up at all, their confused, wrong-headed vitriol only further muddying the initial point. If I don’t understand something, I either let it go or hash it out with someone until I do understand it.

To that point, I wrote this post initially for the Houston Press, then decided that it wasn’t entirely appropriate for the more casual tone of the blog and it went unpublished. But after yesterday, I chose to resurrect it. So here it is: my further explanation of the initial Foodie Backlash article, in hopes that I’ll at least be hated for my actual point instead of any wrongly perceived points.

The Pursuit of Self Via Food: The Good, the Bad and the Ugly

Considering the wave of “foodie backlash” articles lately — and the rising tide of articles quick to leap to foodies’ defense — very little has been said about the reasons why foodie-ism has gained so much momentum in the last few years.

In the Atlantic two weeks ago, B.R. Myers wrote in his piece “The Moral Crusade Against Foodies” that “it has always been crucial to the gourmet’s pleasure that he eat in ways the mainstream cannot afford.”

And in that brief statement, Myers encapsulates the dark heart of the “foodie issue,” as it were: using food as a status symbol in the same way that people use fashion or music to separate themselves from the masses.

In a 2003 conference paper from the American Sociological Association, author Samantha Kwan put forth the idea that food is “no longer regarded as merely the satisfaction of a physiological need low on Maslow’s hierarchy. Rather, food consumption provides individuals a means for the conscious manipulation and display of self.”

More specifically, she states, “ethnic food consumption constitutes ‘identity work.'”

Eight years later, it would be easy to go one step further and add to her theory that conspicuous consumption of the latest food trends constitutes identity work of its own, just as much as shoving your love of Ethiopian food in someone’s face does.

And this, in and of itself, is not necessarily a bad thing. Pursuing a hobby out of love for, say, Ethiopian food is one thing. Pursuing it purely for selfish reasons is another.

Quick crash course on Maslow’s hierarchy of needs: “Identity work” is based on the idea that once you’ve satisfied all of your basic needs — the need for food and water, shelter, employment, friends and family and, finally, more elevated concepts like self-esteem and respect — you’ll seek to satisfy that ultimate goal of self-actualization: expressing your individuality, whether through clothing or cooking.

[Image: Recreation of a Roman triclinium.]

This is, by no means, the first time in history that large groups of people have sought to separate themselves from the masses through appreciation of fine or exotic foods.

More than 2,300 years ago, wealthy Romans were reclining on lecti triclinares in elaborately appointed triclinia as they indulged in multi-course meals that included everything from foie gras and rabbit to charcuterie and raw seafood. Not quite the Trimalchian feast of ancient satire, but close. Sound familiar?

In his book The Upside of Down, author Thomas Homer-Dixon argues that the downfall of Rome can be attributed in part to a scarcity of food resources that eventually led to food crises throughout the empire. All the while, well-to-do Romans were still attempting to one-up each other via elaborate feasts as the general populace grew more and more unhappy with this widening gap — both in terms of wealth and attitude — between the rich and the poor.

And it is this crucial point in B.R. Myers’s article that may have been missed among all the vitriol and viciousness.

“Food writing has long specialized in the barefaced inversion of common sense, common language. Restaurant reviews are notorious for touting $100 lunches as great value for money,” he writes, pointing to the difficult-to-ignore issue that it’s hard to be a “foodie” in a climate where so many go without and when we’re in the midst of a global economic crisis that some consider the worst since the Great Depression.

Myers continues, “And in a time when foodies talk of flying to Paris to buy cheese, to Vietnam to sample pho? They’re not joking about that either.” Kwan, for her part, views these kinds of frenzied flights as no more than “white elites…assert[ing] a specific sense of self.”

[Image: Photo illustration by The State. Attempting to self-actualize and express your individuality through food can quickly lead to insufferable “poseur” behavior, as demonstrated here.]

“These individuals are lured to ‘authentic’ ethnic food,” she continues, “because it allows them to consume literally a symbolic embodiment of the ethnic ‘Other.’ Simply, this consumption is an attempt to align oneself with the ethnic Other and to realize the ‘Authentic Self.'”

Is this attempt to locate one’s “Authentic Self” in another culture’s food — or in multi-course, hours-long tasting menus — necessarily a bad thing? Kwan thinks so: “The consumption of ethnic food separates cultural symbols from the culture that creates them” and, in the process, “dangerously absolves elites from real dialogue with the Other.”

And the same can be said for the continued game of one-upmanship that many foodies find themselves playing with each other.

The pursuit of food as a mere carnal pleasure or as a status symbol can lead to a dangerous separation from the real, crucial food issues at hand — serious issues like health and sustainability. If all that we, as foodies, concentrate on is the hot new chef in town or the ultra-expensive kaiseki dinner we ate in Tokyo, we’re missing the risotto for the rice.

That’s not to say that people shouldn’t continue to express themselves via food. After all, it’s as much a valid art form as sculpture, painting or poetry. But would it kill us to be less pretentious about it?

Eat What’s In Your Pantry

Inspired by this recent post on the eGullet forums (and whoever pointed me to this, let me know in the comments section, because I can’t remember who you are!), I decided to take stock of my cupboard, fridge and freezer. The post calls for people to go without shopping for a week and instead live off the bulk of their presumably packed pantries:

Surely I’m not alone in having a freezer and pantry full of food, much of which will get thrown out as it expires over the course of the coming months and years. Indeed, I live in a small apartment. People with houses, basement freezers and walk-in pantries surely have far more of this stuff lying around than I do. Surely I’m not alone in having overbought at the supermarket last week. Surely I’m not alone when I get home from the supermarket and can barely fit the new food in the refrigerator because there’s so much of the old stuff. Surely I’m not alone in being able to skip a week of shopping and still eat well.

So let’s do it again, together. Let’s all skip a week of shopping. Let’s declare national eat the stuff in our freezers and pantries week.

Think about it from an economic standpoint. Times are tough right now. If you spend $100 a week on groceries, this experiment will put $100 back in your pocket quicker than you can say stimulus. If you’re home 50 weeks of the year and you perform this experiment once per quarter, you’ll reduce your grocery bill by 8%.

So this Sunday, I’m not going shopping. And whether you shop on the weekend or on another day, I’m asking you not to shop either. Instead, let’s eat all the stuff we already have around. And let’s talk about it, compare photos, help one another figure out what to do with that jar of giardiniera or that packet of pilaf.
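(The math on that 8 percent checks out, by the way: skipping one $100 shopping week per quarter means four skipped weeks a year, and 4 of the 50 weeks you’re home shopping works out to 4 ÷ 50 = 8 percent of your annual grocery bill, or $400 of $5,000, back in your pocket.)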

I know plenty of people (Mom, I’m looking at you!) who could comfortably subsist on the contents of their pantries for a week, if not an entire month. In fact, my great-grandmother and great-grandfather — notorious hoarders who kept huge freezers full of food on their property in anticipation of the next stock market crash or an impending zombie apocalypse — would have stared this challenge in the face and laughed hysterically at it.

I, however, cannot. I’m not an all-at-once kind of shopper. I love Costco, but don’t buy in bulk. Instead, I go shopping nearly every day, buying whatever looks best for dinner that night and grabbing any staples that may have run out. That said, I’ve been…a bit busy lately. Without elaborating, allow me to simply run down a quick list of what is currently residing in my fridge, freezer and nearly-bare cupboard.


40 Days of Deprivation

I went through a phase in high school where I was Presbyterian.  Although it was more of a social activity — their youth group went to Schlitterbahn! — I still found myself being both confirmed (after a long series of confirmation classes which were at least informative if not particularly spiritual) and baptized in front of the entire congregation one Sunday.

After this, I participated in Lent each year.  And although I didn’t entirely understand why, I solemnly agreed to give up such terrific vices as chocolate or thinking evil thoughts against the trashy girls who left mean notes in my locker.  Of course, in college I was baptized once again — this time in an old-school Church of Christ — since my previous baptism wasn’t considered legitimate.  It seems that no one has seen fit to develop a Euro of Christian rites and rituals, which would be accepted as valid currency from denomination to denomination.

These days, I’m more spiritual — if anything — than I was in high school, yet I don’t attend church on any kind of regular basis.  And I’m not Presbyterian anymore (that second baptism apparently removed all traces of any earlier membership in the church), so I don’t take part in Lent anymore.  That said, I understand and appreciate that other people do.  I’m always fascinated with the intersection of religion and food (hence the recent Joel Osteen vs. bacon article), and Lent is an interesting time of year to ponder what few food-related mandates modern Christian churches still recognize.

If you think about it, modern-day Christians don’t have too many issues around food.  You eat what you want.  There are no dietary restrictions.  Few people fast, and most of those only do so during periods such as the 40 days of Lent.  Compare that to Jewish or Muslim or Hindu faiths, where strict dietary laws mean that what you put into your mouth is just as important as what you harbor in your heart, where feasts such as Eid al-Fitr serve as celebrations of faith and community, and where fasts such as Yom Kippur bring you closer to God through atonement and deprivation.

Lent is one of the few times that Christians look at food through a spiritual lens.  Catholics take the season a bit more seriously than their Protestant cousins, fully abstaining from eating meat on Fridays during the 40 days.  Most people, however, simply decide upon a food or beverage that they’ll give up during the season and go without alcohol or sweets for a little over a month.  These are popular indulgences to forgo, hence the hedonistic revelry of Mardi Gras or Carnival immediately preceding Lent.

Lent is intended to remind Christians of the 40 days that Jesus Christ spent in the desert wilderness, resisting the temptations that Satan put before him, and to prepare them for the coming festival of Easter and celebration of Christ’s resurrection.  Giving up certain foods or vices or activities is a modern means of resisting temptation, all the while awaiting the glories that lie ahead — both in the form of spiritual glory and in the glory of finally being able to eat chocolate cake or drink a beer again.

Do you celebrate Lent?  If so, what are you giving up this year?  And for those of you who are giving up food or beverage, did you have a final indulgence in your chosen item last night?  Don’t lie…I know at least some of you did.  Spill it below.

Tuesday Trivia: Thursday Edition

Your patience with our much-delayed Tuesday Trivia will be rewarded this week with a shiny new prize! What is that prize? Find out after trivia…

  1. Medieval writers and religious figures took a very broad view on gluttony, arguing that the sin encompassed more than simply over-indulgence in food and beverage. Thomas Aquinas went so far as to prepare a list of six additional ways one could commit gluttony while consuming a meal. What were three of these ways?
  2. Gluttony isn’t the only deadly sin that relates to food. Avarice, or greed, is responsible for driving up the cost of food items worldwide as investors and commodities traders profit from the abject poverty and hunger in countries like the Philippines, Honduras and Bangladesh. Since 2000, the worldwide price of various oils and fats has risen by 300%, the price of milk by over 150%. By how much has the price of grains gone up since 2000?
  3. People have historically used food as one of many displays of wealth and pride, and still do to this day. Caviar is generally accepted as one of the food items most easily associated with a prideful life. What is the highest grade of Russian caviar on the market? Hint: its name is derived from the Russian word for “little salt.”
  4. Throughout history, people have sought aphrodisiacs to increase their own virility or induce lust in the object of their affections. Which of these foods is not traditionally considered an aphrodisiac: balut, arugula, ginseng, kelp or abalone?
  5. People go to war for many things: religion, land, natural resources. Food (and famine) has been one of the main causes of wrath and wars throughout human history. In fact, most anthropologists now believe that the population of what mysterious island was wiped out after a civil war over food (or, rather, a lack thereof)?
  6. BONUS: Sloth has created a nation (and a world) obsessed with fat-and-calorie-laden fast food and pre-packaged meals. What creation has been widely dubbed the “worst fast food burger” in America, nutritionally speaking?

Now, obviously, the theme this week was…the Seven Deadly Sins. And the reason for that is two-fold. The first reason is that this week’s prize is one of my all-time favorite food anthropology books, In the Devil’s Garden: A Sinful History of Forbidden Food.

Today’s trivia winner will receive a copy of this truly fascinating book, shipped directly to their front door. I promise that none of today’s questions come from the book, either, so you’re guaranteed a fresh, interesting, eye-opening look at food taboos and food history as it relates to the Western concept of the Seven Deadly Sins.

The second reason is that Randy Rucker will be holding his highly-anticipated Seven Deadly Sins dinner this Monday, October 20th, at Culinaire Catering on Milam. The menu for the night includes seven courses, one for each sin. You don’t want to miss this special tenacity dinner. As always, you can email Randy at rrucker79 at hotmail dot com to RSVP for the dinner. Do it soon! Spots are filling up fast for this one.

Answers (and this week’s winner! — I’m very excited about this!!!) will be announced tomorrow afternoon, so hurry up and get those guesses in before anyone else comes in to crib off you! See you all back here on Friday, bluebirds!

The Omnivore’s Hundred

Cleverley posted this to our Houston Chowhounds board this morning, and I simply couldn’t resist…

From British food writer Andrew Wheeler’s blog, Very Good Taste, comes this list of 100 things that — apparently — you should eat before you die.  Here’s how it works:

1) Copy this list into your blog or journal, including these instructions.
2) Bold all the items you’ve eaten.
3) Cross out any items that you would never consider eating.
4) Optional extra: Post a comment here at www.verygoodtaste.co.uk linking to your results.

There’s not a lot that I won’t eat (so you probably won’t see many strike-throughs).  However, that doesn’t mean I’ve tried everything; there’s still plenty here I haven’t yet had a chance to try.  And I think this list is as good a place as any to start…

  1. Venison
  2. Nettle tea
  3. Huevos rancheros
  4. Steak tartare
  5. Crocodile
  6. Black pudding
  7. Cheese fondue
  8. Carp
  9. Borscht
  10. Baba ghanoush

The Sound of Science

Trivia answers coming soon, folks.  Thursday ended up being a bit busier than expected.  For now, enjoy this article about the 2008 Ig Nobel Prize winners, in particular the winner in the Nutrition category who discovered that potato chips taste better when they sound crunchier:

Charles Spence’s award-winning work also has to do with the way the mind functions. Spence, a professor of experimental psychology at Oxford University in England, found that potato chips — “crisps” to the British — that sound crunchier taste better.

His findings have already been put to work at the world-famous Fat Duck Restaurant in England, where diners who purchase one seafood dish also get an iPod that plays ocean sounds as they eat.

Well, of course.  On both counts.  Who wants a non-crunchy potato chip?  More importantly, who doesn’t need a gimmicky iPod to “enhance” their dining experience?  *sigh*

You can read more about the potato chip study (officially titled The Role of Auditory Cues in Modulating the Perceived Crispness and Staleness of Potato Chips — and, really, haven’t we all written a similarly-titled college thesis at some point in our lives?  They just happened to have excellent research to back theirs up, unlike my 2am-Thursday-before-it’s-due physics paper on the incendiary properties of Flaming Hot Cheetos, which was only slightly less well-received than my paper on Greek and Egyptian creation mythology from a Judeo-Christian perspective, entitled Greeks to the North of Me, Egyptians to the South, Here I Am Stuck in the Middle with Jews, which despite being geographically accurate earned me a C- from my very unamused anthropology professor) at The Guardian.

They recruited volunteers who were willing to chew, in a highly regulated way, on Pringles potato crisps. Pringles themselves are, as enthusiasts well know, highly regulated. Each crisp is of nearly identical shape, size and texture, having been carefully manufactured from reconstituted potato goo.

Mmm…potato goo.  Happy Friday!

Black Food Is In

From The Root comes this interesting story about the health benefits of dark-colored food:

Black food is in. And we’re not talking about your grandmother’s fried chicken or Aunt Sadie’s peach cobbler. Instead, it seems that with food, the darker it is, the better it is for you.

Wait…what?  Aside from the really awful “darker the berry, sweeter the juice” jokes that this begs, could that intro have been any more blatantly racist?

I guess since it’s written by a supposedly black author at a black website, it’s okay.  Right?  …not really.

Number One, fried chicken and peach cobbler are no more “black” foods than cornbread and catfish.  Those foods are Southern foods, traditional Southern cuisine.  Not “black” cuisine.  Both blacks and whites in the South eat foods like grits, barbecue, sweet potato pie, okra, field peas and collard greens.  Always have, always will.  What a ridiculous idea that certain foods are “blacker” than others.  Which leads me to…

Number Two, the insinuation that black folks only eat fried chicken or peach cobbler is offensive.  Why not just throw watermelon and pink lemonade into the mix and further stereotype yourself?  Even better, we can go back to old-timey advertisements like this:

Or some classic old Cream of Wheat ads:

While the rest of the article was interesting, that intro almost completely turned me off from reading it.  Nothing like setting a group of people back fifty years or so.  Well done, The Root.

A Big Girl Question

I honestly don’t know if I’m classified as a “big girl” or not.  I hadn’t given it much thought until this very second, when I decided to post this awesome question from one of my favorite websites: Manolo for the Big Girl.

Perhaps I toe the line between “average girl” and “big girl.”  Perhaps it’s all a matter of perception: I shop in the “regular sized” stores and clothing departments, but I’m sure that society probably views me as a “big girl.”  Perhaps clothing designers think that if you’re five-foot-nothing, then you must also be 100 pounds with no discernible breasts, making shopping for clothes more difficult than it really ought to be.  All I know is that I’m not a stick insect, I like to eat, and I’m happy with both of these things.

So, back to my original intent here.  Plumcake and Francesca have such a wonderfully witty way with words (ack!  that’s a lot of alliteration!) and reading Manolo for the Big Girl every morning jump-starts and invigorates me.  Take Plumcake’s excellent description of her lunch a few days ago:

  • Two sliced-up Braeburn apples.
  • One red plum with only most of the sticker removed. Rest of sticker to be discovered between teeth at later date.
  • Odiously hateful organic peanut butter. Technically peanut butter the same way my best friend from college is technically a virgin.
  • One cup peach-flavored probiotic kefir (yogurt’s smug, Nader-voting cousin) mixed with some crunchy sprouted-grain cereal that tastes like angry sweater.
And her later description of her coworker’s lunch:

Is it because she was hauling some sad, Dickensian-looking microwaved meal and wanted to show solidarity?

Love, people.  Anyway, Plumcake asks an interesting question at the end of her post, which I think merits a read and an answer.  Check it out for yourselves…

The Big Question: You’re Eating THAT?!

Thursday Answers…Slightly Delayed

I apologize for the delay…  I’ve been hard at work on both real, Day Job things and Houstonist things.  One of those things is scheduled to post today at 4pm on Houstonist, so keep an eye out for it!  I worked my little tail off writing it, editing it, taking pictures for it and formatting it, and am just super excited to finally get it out there!

Anyway, on to our Thursday answers!  This week’s winner will be announced after the jump:

  1. The cocoa press was the development that made possible the mass production of chocolate in the candy bar form that we all know and love today.  The Dutch chocolatier Coenraad van Houten developed the modern cocoa press in an effort to find a way to make his chocolate less oily, so that the beverages would be lighter and easier to drink.  He ended up creating a screw press in 1828 that separated the cocoa butter from the bean itself, yielding cocoa powder, and we still use both today!
  2. True, and it created quite a falling out between the two researchers who shared the lab.  Constantin Fahlberg, a student, was working in the lab of chemist Ira Remsen at Johns Hopkins; the two of them were studying organic chemicals.  One day, Fahlberg was eating a piece of bread and noticed that it was overpoweringly sweet.  Tracing the sweetness back to the chemicals from the lab, he realized that they’d inadvertently created an artificial sweetener.  Fahlberg patented the sweetener (which he called saccharin) without the knowledge of his professor, Remsen, which created a lifelong rivalry and rift between them.
  3. Samp and hominy have, at various points, been used to refer to grits.  Samp is actually dried corn kernels that have been broken down, but not to a fine meal.  Hominy is dried corn that has been soaked in lye to remove the hulls and soften the corn until it’s palatable.  And grits are the greatest food mankind has ever known.
  4. Believe it or not, there was a citywide epidemic of rickets among the children of Dublin when the city was restricted to eating whole-grain bread.  Why?  The whole-grain bread contained such low amounts of calcium and such high amounts of bran (which further blocks calcium absorption by the body) that the children developed rickets as a result of exaggerated calcium deficiency.  Just goes to show that too much of anything — even a good thing — can be a bad thing.
  5. Both wheat and barley were domesticated before any other cereal grain, including rice and corn (4500 B.C.), millet and sorghum (4000 B.C.) and oats (circa 100 A.D.).  Although wheat and barley were both of great importance to ancient civilizations, only wheat has retained that popularity.  In the West, barley is used primarily as animal feed and for producing beer.  It’s a shame, because there’s nothing like a big bowl of hot barley with stewed tomatoes, onions and garlic.  Ask my mother sometime; she’ll make you a bowl.
  6. BONUS: Wheat and barley were both originally cultivated around 7000 B.C.  Around the same time, humans were also finally figuring out that they could domesticate animals, including goats, pigs and camels.

So, who won?  Find out after the jump…
