Kitchen tranquility

A small story about our frustration with modern recipe websites and how an LLM helps us filter through the noise

Now I really like to cook, ideally when I have time. Alas, with two kids and both of us having day jobs, finding enough time to cook seems an almost impossible quest. When I finally secure a moment to retreat to the kitchen and get my hands dirty, all I want is to immerse myself fully in the culinary journey.

We have plenty of Ottolenghis shelved in our kitchen, and I am sure I could quickly find a masterful recipe just a couple of page turns away. Yet for some reason I have developed the unhinged habit of relying more and more on Google for inspiration.

Now as I’m sure you all know, internet recipes have enshittified to the point where they are practically unusable. For those of you who live entirely on Uber Eats and never touch a pan, reading a typical internet recipe goes something like this:

  1. Trying to find the ‘reject all’ button on the full-screen privacy banner (usually not a successful quest, so you proceed with signing your rights away).
  2. Clicking away an advertisement for some random electrical appliance that we already bought two weeks ago.
  3. The actual page loads. First, you get served an unnecessarily long treatise on some topic completely unrelated to the recipe. About travel, health, family life, etc. It always amazes me how bizarrely little information can be transferred in so many words.
  4. Then, once you find the actual recipe, scrolling from ‘ingredients’ to ‘instructions’ (which I do all the time) can also be a pain, since that gap is the perfect place for more ads.

Turning panic into magic

It was with this mindset that I decided back in March to explore whether an LLM could be used to sift through the noise. I had a rough idea of the flow:

flowchart TD
    A[Google search] --> B[Click on recipe link]
    B --> C[Share page with OpenAI]
    C --> D[Receive filtered recipe]
    D --> E[Store in Catalog]

The most straightforward way appeared to be to use the OpenAI API endpoint and send it the full website text, preceded by a simple prompt:

The given website contains a recipe. Please look at this website, and extract ONLY the recipe, no additional text or stories! Format your response in .md format. Do not return the markdown block indicators (accents). Strictly adhere to the following syntax, and keep your answer in the original language:

Title of recipe

Category of the recipe, using only tag-notation! Choice of tags is strictly limited to the following list: #fish, #seafood, #chicken, #beef, #pork, #eggs, #cheese, #lentils, #tofu, #bread, #potatoes, #cereals, #salad, #soup, #vegetables, #pasta, #beans, #rice, #dairy, #vegan, #vegetarian, #spicy, #oven, #asian, #italian, #mediterranean, #french. Often multiple tags are required to categorize the dish, please check this very carefully!
Total cooking time using tags. Only respond with one of the following options: #15min, #30min, #45min, #1hr, #1.5hr, #2+hrs, whichever is closest. No additional text.
For XX persons (if found in the website text, if not, leave empty)

Ingredients

Instructions
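The Shortcut does the heavy lifting on the phone, but the same call is easy to sketch in plain Python against the Chat Completions endpoint. Everything below is illustrative rather than the exact Shortcut configuration: the prompt is shortened, and the helper names are my own.

```python
import json
import os
import urllib.request

# Shortened stand-in for the full extraction prompt quoted above.
PROMPT = (
    "The given website contains a recipe. Please look at this website, and "
    "extract ONLY the recipe, no additional text or stories! "
    "Format your response in .md format."
)

def build_payload(page_text: str, model: str = "gpt-4o") -> dict:
    """Assemble the Chat Completions request body: prompt first, then page text."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": f"{PROMPT}\n\n{page_text}"}],
    }

def extract_recipe(page_text: str) -> str:
    """POST the payload to the OpenAI API and return the model's markdown recipe."""
    req = urllib.request.Request(
        "https://api.openai.com/v1/chat/completions",
        data=json.dumps(build_payload(page_text)).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {os.environ['OPENAI_API_KEY']}",
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["choices"][0]["message"]["content"]
```

The `build_payload` helper is split out so the request body can be inspected without an API key; only `extract_recipe` actually hits the network.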

So far the results have been surprisingly stable, no matter what absolute slop websites you throw at it. It works fine with the cheap GPT-4o, since no reasoning or long context windows are required. I have not spotted any clear hallucinations, but I am almost sure they are there.

The actual API call is initiated from my iPhone, via an Apple shortcut. After a bit of trial and error I got a rough version working. For those of you not afraid to experiment, I have included a download link in this footnote: 1 (use at your own risk).

To have this Shortcut appear in the “Share sheet”, you would need to:

  1. Open the Shortcuts app
  2. Tap the three dots on the Shortcut
  3. Tap the ⓘ icon to open its Details/Settings
  4. Enable “Show in Share Sheet”
  5. Tap “Share Sheet Types” and select “URLs”

Our Recipe Catalog

Since the prompt returns markdown-formatted text, I told my Apple Shortcut to save the output as a .md file in our shared iCloud folder. From there, we use Obsidian on our phones and laptops to read the files. Although I do not really like the UX of Obsidian on iOS, it works reasonably well, and you can categorize recipes based on the #tags.
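The save step lives inside the Shortcut, but in Python terms it amounts to roughly the following. The folder handling and filename rule here are my own illustration; the only grounded assumption is that, per the prompt's syntax, the recipe title is the first line of the output.

```python
import re
from pathlib import Path

def save_recipe(markdown: str, folder: Path) -> Path:
    """Write the model's markdown output to the shared folder,
    named after the recipe title on the first line."""
    title = markdown.splitlines()[0].strip()
    # Hypothetical slug rule: keep word characters, join with dashes.
    slug = "-".join(re.findall(r"\w+", title)) or "recipe"
    path = folder / f"{slug}.md"
    path.write_text(markdown, encoding="utf-8")
    return path
```

Obsidian then picks the new file up automatically, since a vault is just a folder of .md files.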

In the end, the output is simply a folder of .md files, so when a better platform comes along we can just copy them over.

To keep all recipes in one location, I also built a variant of the flow that accepts photos instead of a URL. That lets us simply scan pages from a cookbook and generate the same recipe format. Again, a link in the footnote: 2
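For the photo variant, the only real change is the message shape: GPT-4o accepts images inline as base64 data URLs alongside the text prompt. A sketch of that payload (again illustrative, not the Shortcut's exact request):

```python
import base64

def build_photo_payload(image_bytes: bytes, prompt: str, model: str = "gpt-4o") -> dict:
    """Chat Completions body with the prompt plus one scanned page as an inline image."""
    b64 = base64.b64encode(image_bytes).decode("ascii")
    return {
        "model": model,
        "messages": [{
            "role": "user",
            # Content becomes a list of parts: the text prompt, then the image.
            "content": [
                {"type": "text", "text": prompt},
                {
                    "type": "image_url",
                    "image_url": {"url": f"data:image/jpeg;base64,{b64}"},
                },
            ],
        }],
    }
```

Multiple scanned pages would just mean appending more `image_url` parts to the same content list.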

Next step, generate a shopping list

We have a great local organic street market here in town every Saturday. It’s a joy to visit, but not always very efficient, given that they only sell the healthy stuff (veggies, fruit, cheese, meat) and we still need to go to the supermarket afterwards.

Luckily, an LLM can help out here too. Building a quick ‘week menu’ in Obsidian is just a matter of creating a blank page and linking .md files from our recipe catalog. We then send this entire text blob to GPT-4o with the following prompt:

Below follow a handful of recipes. I want to go shopping and buy all the ingredients for these recipes. Can you combine all ingredients into one big list? Please do not mention oils, onions, garlic, turmeric, cumin powder, coriander powder, ground mace, fenugreek powder, sugar or soy sauce, because we always have plenty of those in the house.

I will first go to the fruit and vegetables market, so please start your list with that. Then I will go to the supermarket for the remainder.

Please format your response in Markdown (.md) and use unchecked checkboxes for each item I need to buy. Do not include the markdown block indicators (accents).
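Assembling that text blob is straightforward because Obsidian links are just `[[Recipe name]]` markers in the menu page. A small sketch of the gathering step, where the link pattern and the data shapes are assumptions about a typical vault rather than my exact setup:

```python
import re

def collect_week_menu(menu_text: str, recipes: dict[str, str]) -> str:
    """Resolve Obsidian-style [[wikilinks]] in the menu page and join the
    linked recipe texts, ready to be prepended with the shopping prompt."""
    # Capture the link target, ignoring any |alias suffix Obsidian allows.
    names = re.findall(r"\[\[([^\]|]+)", menu_text)
    return "\n\n".join(recipes[name] for name in names if name in recipes)
```

The combined string, with the prompt above in front of it, is then sent to GPT-4o exactly like the recipe extraction call.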

Even though this is just a simple project (and yes, we should not leave those Ottolenghi books untouched), I find it amazing that we now have technology that enables this. There is so much more to automate. Generative AI will almost certainly clutter the internet as we know it with slop, which is a different problem altogether, but at least for now, LLMs are also a great tool for filtering out the noise.