Essays about game development, thinking and books

Grainau: hiking and beer at 3000 meters

How it all looks from the ground.

For her vacation, Yuliya decided to show me the beautiful German mountains and took me to Grainau for a couple of days — a corner of Bavaria that is almost like Switzerland. At least, it looks like the pictures of Switzerland I've seen :-D

In short, it's a lovely place with a measured pace of life. If you need to catch your breath, calm your nerves, and enjoy nature, then this is the place for you. But if you can't live without parties, you'll get bored quickly.

What's there:

  • The highest mountain in Germany plus a couple of glaciers.
  • Skiing in winter. If you really want to, you can find a place to ski in summer too, but the run is short and the lifts are turned off.
  • A large clean lake and a couple of smaller ones.
  • A huge number of trails for hiking.
  • A huge number of waterfalls, streams, and a couple of mountain rivers.
  • Restaurants with beer.
  • Beautiful fallen trees in the forests, private property, fences, cows with bells, and "racing tractors" (I don't know what else to call this phenomenon, but the tractors there move fast :-D).

That's the short version; now for the details.

Read more

Migrating from GPT-3.5-turbo to GPT-4o-mini

Guess when I switched models.

OpenAI recently released GPT-4o-mini — their new flagship model for the budget segment, so to speak.

  • They say it works "almost like" GPT-4o, sometimes even better than GPT-4.
  • It is almost three times cheaper than GPT-3.5-turbo.
  • A context size of 128k tokens, versus 16k for GPT-3.5-turbo.

Of course, I immediately started migrating my news reader to this model.

In short, it's a cool replacement for GPT-3.5-turbo. I immediately replaced two LLM agents with one without changing prompts, reducing costs by a factor of 5 without losing quality.
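
At the API level, the swap itself is tiny: usually just a change of the model name in the request. Below is a minimal sketch using the OpenAI Python SDK; the system prompt and variable names are placeholders, not the actual prompts or agents from my reader.

```python
# A minimal sketch of this kind of migration with the OpenAI Python SDK (openai >= 1.0).
# The prompt text and variable names here are placeholders, not my real prompts.
from openai import OpenAI

client = OpenAI()  # expects OPENAI_API_KEY in the environment

article_text = "..."  # placeholder: the news item to process

response = client.chat.completions.create(
    model="gpt-4o-mini",  # was "gpt-3.5-turbo" before the migration
    messages=[
        {"role": "system", "content": "Summarize the news item in one short paragraph."},
        {"role": "user", "content": article_text},
    ],
)

print(response.choices[0].message.content)
```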

However, once I started tuning the prompt to make it even better, I began running into nuances. Let me tell you about them.

Read more

Review of the book "The Signal and the Noise"

The cover of the book "The Signal and the Noise".

Nate Silver — the author of "The Signal and the Noise" — is widely known for his successful forecasts, such as his predictions of US election results. It is not surprising that the book became a bestseller.

As you might guess, the book is about forecasts. More precisely, it is about approaches to forecasting, complexities, errors, misconceptions, and so on.

As usual, I expected a more theoretical approach, in the spirit of Scale [ru], but the author chose a different path and presented his ideas through the analysis of practical cases: one case per chapter. Each chapter describes a significant problem, such as weather forecasting, and offers several lenses for looking at how forecasts are built. This certainly makes the material more accessible, but personally, I would have liked more systematization and theory.

Because of the case-study approach, it isn't easy to write a brief summary of the book. It is possible, and it would even be interesting to try, but the amount of work is too large — the author did not set out to provide a coherent system or a short set of basic theses.

Therefore, I will review the book as a whole, provide a rough list of those lenses, and share some cool facts.

Read more

My GPTs and prompt engineering

Ponies are doing prompt engineering (c) DALL-E

I've been using ChatGPT almost since the release of the fourth version (so for over a year now). Over this time, I've gotten pretty good at writing queries to this thing.

At some point, OpenAI allowed customizing chats with your own text instructions (look for Customize ChatGPT in the menu). Over time, I added more and more commands there, and recently the size of my instructions exceeded the allowed maximum :-)

Also, it turned out that a universal instruction set is not such a good idea — you need to tailor instructions to different kinds of tasks; otherwise, they won't be as useful as they could be.

Therefore, I moved the instructions to GPT bots instead of customizing my chat. OpenAI calls them GPTs. They are the same chats but with a higher limit on the size of the customized instructions and the ability to upload additional texts as a knowledge base.

Someday, I'll make a GPT for this blog, but for now, I'll tell you about two GPTs I use daily:

For each, I'll provide the basic prompt with my comments.

By the way, OpenAI recently opened the GPT store; I'd be grateful if you liked my GPTs there. Of course, only if they are useful to you.

Read more

"Slay The Princess" — combinatorial narrative

My favorite version of the Princess.

It's hard to impress me as a player, and even harder as a game developer. The last time it happened was with Owlcat Games and Pathfinder: Kingmaker, when they added a timer to the game's plot.

But Black Tabby Games managed it. And they did it not with some technological feat, but with a visual novel on a standard engine (Ren'Py), which is cool in itself.

I'll share a couple of thoughts about the game and its narrative structure while the impressions are still fresh. I need to think about how to adapt this approach to my own projects.

ATTENTION: SPOILERS!

If you haven't played Slay The Princess yet, I strongly recommend catching up — the game takes 3-4 hours. You won't regret it.

Read more